Python 3 Concurrent Programming: Multi-process Synchronization (Lock)
Processes do not share data, but they do share the same file system, so multiple processes can access the same file or the same print terminal. Access itself is not the problem; the problem is that uncontrolled competition garbles the results. The way to control access is with a lock.
1. Multiple processes sharing the same print terminal
```python
from multiprocessing import Process
import os, time

def work():
    print('%s is running' % os.getpid())
    time.sleep(2)
    print('%s is done' % os.getpid())

if __name__ == '__main__':
    for i in range(3):
        p = Process(target=work)
        p.start()
```

Concurrent execution is efficient, but the processes compete for the same print terminal, so the output gets jumbled.
```python
from multiprocessing import Process, Lock
import os, time

def work(lock):
    lock.acquire()
    print('%s is running' % os.getpid())
    time.sleep(2)
    print('%s is done' % os.getpid())
    lock.release()

if __name__ == '__main__':
    lock = Lock()
    for i in range(3):
        p = Process(target=work, args=(lock,))
        p.start()
```

Locking turns the concurrent runs into serial ones, sacrificing efficiency but avoiding competition for the terminal.
2. Multiple processes sharing the same file
Using a file as the database, we mock a ticket-grabbing scenario.
```python
# The contents of the file db.txt are: {"count": 1}
# Be sure to use double quotes, otherwise json will not parse it
from multiprocessing import Process, Lock
import time, json

def search():
    dic = json.load(open('db.txt'))
    print('\033[43mTickets remaining: %s\033[0m' % dic['count'])

def get():
    dic = json.load(open('db.txt'))
    time.sleep(0.1)  # simulate network latency while reading
    if dic['count'] > 0:
        dic['count'] -= 1
        time.sleep(0.2)  # simulate network latency while writing
        json.dump(dic, open('db.txt', 'w'))
        print('\033[43mTicket purchased successfully\033[0m')

def task(lock):
    search()
    get()

if __name__ == '__main__':
    lock = Lock()
    for i in range(100):  # simulate 100 concurrent clients grabbing tickets
        p = Process(target=task, args=(lock,))
        p.start()
```

Concurrent and efficient, but the processes race to write the same file, so the data gets corrupted.
```python
# The contents of the file db.txt are: {"count": 1}
# Be sure to use double quotes, otherwise json will not parse it
from multiprocessing import Process, Lock
import time, json

def search():
    dic = json.load(open('db.txt'))
    print('\033[43mTickets remaining: %s\033[0m' % dic['count'])

def get():
    dic = json.load(open('db.txt'))
    time.sleep(0.1)  # simulate network latency while reading
    if dic['count'] > 0:
        dic['count'] -= 1
        time.sleep(0.2)  # simulate network latency while writing
        json.dump(dic, open('db.txt', 'w'))
        print('\033[43mTicket purchased successfully\033[0m')

def task(lock):
    search()
    lock.acquire()
    get()
    lock.release()

if __name__ == '__main__':
    lock = Lock()
    for i in range(100):  # simulate 100 concurrent clients grabbing tickets
        p = Process(target=task, args=(lock,))
        p.start()
```

Locking turns the ticket-purchasing step from concurrent into serial, sacrificing efficiency but guaranteeing data safety.
A lock ensures that when multiple processes modify the same piece of data, only one task can modify it at a time; the modifications happen serially. Yes, this is slower, but it trades speed for data safety.
Although you can share data through files to achieve inter-process communication, this approach has problems:
1. It is inefficient
2. You have to handle the locking yourself
For this reason, the multiprocessing module provides us with a message-based IPC mechanism: queues and pipes.
1. Both queues and pipes store data in memory
2. A queue is built on top of a pipe plus a lock, which frees us from dealing with complex lock problems
We should try to avoid shared data and use message passing and queues whenever possible. This sidesteps complex synchronization and locking problems, and it usually scales better as the number of processes grows.