Using meliae to analyze Python memory footprint: an example


The DHT protocol search program I wrote has been optimized over the past few days, and lookups are indeed much faster. But a new problem appeared: memory usage soared. Running 10 crawlers took about 800 MB of memory. At first I thought it was because there were too many nodes; I found and fixed a few minor issues, but it made no difference. So I went looking for Python memory analysis tools online and collected some data. Python has the meliae library, which is very convenient to use. The analysis showed that the cause was not too many nodes after all: it was the dictionary that stores the t_id of each sent query (used to match returned messages to their queries), which had grown far too large.
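For context, a DHT crawler typically records the transaction id (t_id) of every query it sends so that responses can be matched back to their queries. Below is a minimal sketch of that pattern; the names pending_queries and send_query are my own, not from the original program, and the actual network send is omitted:

import os
import time

# Maps transaction id -> (destination address, time the query was sent).
# If the remote node never replies, its entry is never removed, so the
# dictionary can grow without bound -- the behaviour described above.
pending_queries = {}

def send_query(message, addr):
    t_id = os.urandom(2)                    # short random transaction id
    message['t'] = t_id
    pending_queries[t_id] = (addr, time.time())
    # sock.sendto(bencode(message), addr)   # actual send omitted

def on_response(message):
    t_id = message.get('t')
    # A matching response releases the bookkeeping entry.
    pending_queries.pop(t_id, None)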

From the analysis output it is easy to see the count and size of each type of object. At first I assumed the problem was that many queries were sent but the remote side never replied, so the entries in this dictionary were never released. I added an expiration time to each entry and deleted expired ones, as sketched below. That did shrink it, but not by much: it seemed to save a few dozen MB, less than 100 MB. Then I shortened the interval for searching a random hash. Previously the check ran once a minute; I changed it to check right away, and memory did not drop at all. I don't know what the reason is. Presumably, when looking up a hash, the crawler asks some nodes, then asks the nodes those nodes return, and so on, so the count keeps climbing, but I don't understand how it reaches more than 600,000 objects in a little over a minute. In other words, that many objects sit in memory without being released. Once memory reaches this level it basically stops changing; there is only a very slow rise, and since other programs were running at the same time I'm not sure whether their objects caused the increase. I'll dump and test in stages to narrow it down.
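A minimal sketch of the expiry-based cleanup described above; the timeout value and the helper name purge_expired are assumptions, not the author's code:

import time

QUERY_TIMEOUT = 60  # seconds; assumed value

def purge_expired(pending_queries, now=None):
    # Drop entries whose query was sent longer ago than QUERY_TIMEOUT.
    now = now if now is not None else time.time()
    expired = [t_id for t_id, (addr, sent_at) in pending_queries.items()
               if now - sent_at > QUERY_TIMEOUT]
    for t_id in expired:
        del pending_queries[t_id]
    return len(expired)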

Installation is straightforward: pip install meliae. I noticed the project has not been updated for a long time, and I don't know whether there is a better alternative, but it works well enough.

Dump memory to a file

The code is as follows:

import time
from meliae import scanner
# Write a snapshot of all live objects to a timestamped file
scanner.dump_all_objects('/tmp/dump%s.txt' % time.time())
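In a long-running crawler it can be more convenient to trigger the dump on demand instead of editing the code each time. One possible approach, assuming a Unix host, is a signal handler; this is my own addition, not part of the original article:

import signal
import time
from meliae import scanner

def _dump_on_signal(signum, frame):
    # Write a timestamped dump whenever the process receives SIGUSR1.
    scanner.dump_all_objects('/tmp/dump%s.txt' % time.time())

signal.signal(signal.SIGUSR1, _dump_on_signal)
# From a shell:  kill -USR1 <pid of the crawler process>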

Analyzing the file:
The code is as follows:

from meliae import loader
# Load the dump file
om = loader.load('/opt/log/dump.txt')
# Compute the reference (parent) relationships between objects
om.compute_parents()
# Collapse each instance's __dict__ attribute into the instance itself
om.collapse_instance_dicts()
# Summarize memory usage by type
om.summarize()

The fields in the summary mean the following:
Index: row index in the summary table
Count: total number of objects of this type
%(Count): this type's object count as a percentage of the total number of objects of all types
Size: total number of bytes used by objects of this type
%(Size): this type's bytes as a percentage of the total bytes used by all types
Cum: cumulative %(Size) from the first row down to this row
Max: size in bytes of the largest single object of this type
Kind: the object type
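Once summarize() points at a suspicious type, it can help to pull out the largest instances of that type directly. A short sketch, assuming the proxies returned by get_all expose size and address attributes (as they do in the meliae versions I have seen):

# The t_id bookkeeping showed up as an oversized dict, so look at the
# biggest dict objects first.
dicts = om.get_all('dict')
for d in sorted(dicts, key=lambda o: o.size, reverse=True)[:5]:
    print('%s bytes at address %s' % (d.size, d.address))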

Analyzing a single object to find its reference relationships

The code is as follows:

# Get all POP3ClientProtocol objects
p = om.get_all('POP3ClientProtocol')
# Look at the first object
p[0]
# List all objects that this object references
p[0].c
# List all objects that reference this object
p[0].p
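To work out which container is actually keeping an object alive, it is usually enough to walk the parent proxies and print their types. A small sketch, assuming each proxy exposes type_str, address, and size attributes:

obj = p[0]
for parent in obj.p:
    # Each parent is another object proxy; its type and size usually make
    # the owning container obvious (e.g. one very large dict).
    print('%s at %s, %s bytes' % (parent.type_str, parent.address, parent.size))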
