A recent performance analysis of MongoDB required putting the database under load. The first attempt used the threading module to write a multi-threaded stress test, and the results were disappointing: single-node reads peaked at only about 1,000 requests/s, while an equivalent Java program reached 6,000-7,000 requests/s. This suggests that Python's multithreaded performance, constrained by the GIL, is not up to the task.
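The threading approach that underperformed can be sketched as follows. This is not code from the original post: `fake_query` is a hypothetical CPU-bound stand-in for the real pymongo `find_one()` loop (BSON decoding holds the GIL, so CPython threads serialize on it rather than running in parallel).

```python
import threading
import time

def fake_query(num):
    # Hypothetical stand-in for the pymongo find_one() loop: the work is
    # CPU-bound and runs with the GIL held, so threads cannot overlap it.
    total = 0
    for i in range(num):
        total += i * i
    return total

def run_threads(thread_count, num):
    # One worker thread per batch of queries, mirroring a threading-based
    # stress test; returns the wall-clock time for all threads to finish.
    threads = [threading.Thread(target=fake_query, args=(num,))
               for _ in range(thread_count)]
    start = time.time()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return time.time() - start
```

With CPython's GIL, `run_threads(4, N)` takes roughly as long as four sequential `fake_query(N)` calls, which is consistent with the poor throughput observed above.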
The test was then rewritten with the multiprocessing module to drive the load from multiple processes. Tests showed that multiprocessing performs well, with throughput comparable to the Java program.
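The multiprocessing pattern can be distilled into a minimal, self-contained sketch (again with a hypothetical `fake_query` standing in for the real `find_one()` loop, and assuming a POSIX system where the "fork" start method is available):

```python
import multiprocessing

def fake_query(num):
    # Hypothetical stand-in for the pymongo find_one() loop; it must be a
    # module-level function so the Pool can ship it to worker processes.
    total = 0
    for i in range(num):
        total += i * i
    return total

def run_pool(process_num, task_count, num):
    # One apply_async call per batch of queries. Note the one-element
    # tuple (num,): passing (num) is just a parenthesised int, not an
    # argument tuple, and the task would fail silently in the worker.
    ctx = multiprocessing.get_context("fork")
    pool = ctx.Pool(processes=process_num)
    results = [pool.apply_async(fake_query, (num,)) for _ in range(task_count)]
    pool.close()
    pool.join()
    return [r.get() for r in results]
```

Because each worker is a separate process with its own interpreter, the GIL no longer serializes the query loops, which is why this approach scales where threading did not.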
The script is as follows:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Python 2 / PyMongo 2.x script.
from pymongo import Connection, MongoClient, MongoReplicaSetClient
import multiprocessing
import time

# connection = MongoClient('mongodb://10.120.11.212:27017/')
# connection = Connection(['10.120.11.122', '10.120.11.221', '10.120.11.212'], 27017)

"""The database uses read/write separation, so connect to MongoDB in replica-set mode."""
connection = MongoReplicaSetClient(
    '10.120.11.122:27017,10.120.11.221:27017,10.120.11.212:27017',
    replicaSet='rs0',
    read_preference=3)  # 3 == ReadPreference.SECONDARY_PREFERRED in PyMongo 2.x
db = connection['cms']
db.authenticate('cms', 'cms')

# Timer decorator: prints how long the wrapped function took.
def func_time(func):
    def _wrapper(*args, **kwargs):
        start = time.time()
        func(*args, **kwargs)
        print func.__name__, 'run:', time.time() - start
    return _wrapper

# Insert test method
def insert(num):
    posts = db.userinfo
    for x in xrange(num):
        post = {"_id": str(x),
                "author": str(x),
                "text": "My first blog post!"}
        posts.insert(post)

# Query test method
def query(num):
    get = db.device
    for i in xrange(num):
        get.find_one({"scanid": "010000138101010000009aaaaa"})

@func_time
def main(process_num, num):
    pool = multiprocessing.Pool(processes=process_num)
    for i in xrange(num):
        # Note the one-element tuple (num,); the original post passed (num),
        # which is not an argument tuple and makes the tasks fail silently.
        pool.apply_async(query, (num,))
    pool.close()
    pool.join()
    print "sub-process(es) done."

if __name__ == "__main__":
    # The concrete argument values were garbled in the original post,
    # so the calls are left commented out here:
    # query(-, 1)
    # main(-, -)
Originally published at http://www.cnblogs.com/reach296/
Python Stress test for MongoDB