An Analysis of Python Memory Management
This article analyzes the Python memory management mechanism in detail and is shared here for your reference. The specific analysis is as follows:
Memory management is a crucial part of a dynamic language such as Python, and it largely determines Python's execution efficiency: a running Python program constantly creates and destroys large numbers of objects, and every one of those operations involves memory management.
Memory pool for small blocks
In Python, a large share of memory requests are for small blocks, and these blocks are usually released again shortly after they are requested. Because these requests are not made to create objects, there is no object-level memory pool mechanism behind them.
[Figure: a full view of the Python memory pool]
This means that, at runtime, Python would otherwise perform a great many malloc and free operations and frequently switch between user mode and kernel mode, which seriously hurts execution efficiency. To speed things up, Python introduces a memory pool mechanism to manage the allocation and release of small blocks of memory; this is the pymalloc mechanism mentioned earlier.
In Python 2.5, the default boundary between small and large memory requests is 256 bytes, a cutoff controlled by the SMALL_REQUEST_THRESHOLD symbol we saw earlier.
In other words, when the requested block is no larger than 256 bytes, PyObject_Malloc serves the request from the memory pool; when the request is larger than 256 bytes, PyObject_Malloc falls back to malloc. Of course, by modifying the Python source code we can change this default value and thereby change Python's default memory management behavior.
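To get an intuitive feel for which side of this boundary everyday objects fall on, here is a minimal sketch (assuming a reasonably recent CPython; exact sizes vary by version and platform) that prints a few object sizes with sys.getsizeof:

import sys

# Small, everyday objects sit well under the 256-byte threshold,
# so their allocations are served by the small-block memory pool.
for obj in (1, "abc", (1, 2, 3), [1, 2, 3]):
    print(type(obj).__name__, sys.getsizeof(obj))

# A 10,000-byte bytearray is far above 256 bytes; a request of this
# size bypasses the pool and goes to the platform malloc instead.
big = bytearray(10000)
print(type(big).__name__, sys.getsizeof(big))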
When an object's reference count drops to 0, its destructor is called.
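As a small illustration (the class name Tracked is just an example for this sketch), CPython's reference counting makes the moment of destruction deterministic:

import sys

class Tracked(object):
    def __del__(self):
        # Runs when the reference count of the instance drops to 0.
        print("destructor called")

obj = Tracked()
print(sys.getrefcount(obj))  # at least 2: the name 'obj' plus the argument held by getrefcount
del obj                      # the last reference disappears, the count hits 0, and __del__ runs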
However, it is important to note that calling the destructor does not necessarily mean that free will be called to release the memory. If it did, the constant allocation and release of memory would greatly reduce Python's execution efficiency (and Python has been criticized for its execution efficiency for years already). Instead, Python makes heavy use of memory object pool techniques, which avoid frequent allocation and release of memory. As a result, when an object is destructed, the space it occupied is usually returned to the memory pool rather than to the system.
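One way to glimpse this pooling from pure Python is to watch a freed object's slot being reused. The following sketch is only suggestive: it relies on CPython's internal float free list, and id() reuse is not guaranteed by the language:

x = float("3.14")     # created at run time, not as a code-object constant
addr = id(x)          # address of the float object in CPython
del x                 # the float's slot goes to an internal free list, not back to the OS
y = float("2.71")     # a new float allocated right afterwards
print(id(y) == addr)  # often True in CPython: the new float reuses the freed slot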
"This problem is: the Python arena never releases the pool. Why is this problem similar to memory leakage. In this case, we apply for 10*1024*1024 small 16-byte memory, which means we must use MB of memory, because Python does not open the WITH_MEMORY_LIMITS compilation symbol of the memory pool restricted mentioned above by default, Python will use arena completely to meet your needs, which is no problem, the key problem is that after a while, you have released all the 16-byte memory and the memory has been back to arena control. It seems that there is no problem.
But the problem occurs exactly at this moment. Arena never releases the pool set maintained by it, so the MB of memory is always occupied by Python. If the program runs in the future, it will no longer need such a huge memory of MB, isn't the memory wasted? "
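The quoted scenario can be sketched in a few lines of Python. This is only an approximation: object() instances happen to be tiny (16 bytes on a typical 64-bit CPython), and the comments assume the arena behaviour described in the quote, i.e. an interpreter that never hands arenas back to the operating system:

# Roughly 10 million tiny allocations, each well under the 256-byte threshold.
blocks = [object() for _ in range(10 * 1024 * 1024)]
del blocks
# The small blocks are now back under the control of pymalloc's pools and arenas,
# but if the arenas themselves are never released, the process keeps holding its
# peak amount of memory even though Python no longer needs it.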
The Python memory management rule here: when del is applied to a list, the list's elements are released, and the large object that managed those elements is recycled into the Python object buffer pool.
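A hedged illustration of that rule: in CPython, the freed list object typically lands in an internal free list and is handed out again for the next list that is created, which we can observe (non-portably) through id():

lst = [1, 2, 3]
addr = id(lst)
del lst                      # each element's reference count drops; the list object itself is recycled
new_lst = ["a", "b"]         # the next list creation draws from CPython's list free list
print(id(new_lst) == addr)   # often True in CPython: the recycled list object is reused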
I hope this article will help you with Python programming.