Producer-Consumer Model:
1. There are two types of roles in the process: one responsible for producing data (the producer) and one responsible for processing data (the consumer).
2. The producer-consumer model is introduced to balance the speed difference between producers and consumers.
3. How to achieve it: producer ---> queue ---> consumer (decoupling); see the minimal sketch below.
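Before the JoinableQueue version that follows, here is a minimal sketch of the model itself, assuming a plain multiprocessing.Queue and a None sentinel to tell the consumer to stop (the sentinel convention is an illustrative choice, not from the original notes):

from multiprocessing import Process, Queue
import time

def producer(q):
    for i in range(3):
        time.sleep(0.2)           # simulate the time it takes to produce
        q.put('item %s' % i)
    q.put(None)                   # sentinel: tells the consumer there is no more data

def consumer(q):
    while True:
        res = q.get()
        if res is None:           # sentinel received, stop consuming
            break
        print('consumed', res)

if __name__ == '__main__':
    q = Queue()                   # the queue decouples the producer from the consumer
    p = Process(target=producer, args=(q,))
    c = Process(target=consumer, args=(q,))
    p.start(); c.start()
    p.join(); c.join()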
1. JoinableQueue:
from multiprocessing import Process, JoinableQueue
import time, random

def producer(name, q, food):
    for i in range(1, 10):
        time.sleep(0.2)
        res = '%s made %s %s' % (name, food, i)
        q.put(res)
    q.join()  # block until every item in the queue has been marked task_done()

def consumer(name, q):
    while True:
        res = q.get()
        if res is None:  # never triggers here; daemon consumers are killed when the main process ends
            break
        print('%s ate %s' % (name, res))
        q.task_done()  # mark one fetched item as processed (queue counter minus one)

if __name__ == '__main__':
    q = JoinableQueue()
    p1 = Process(target=producer, args=('egon', q, 'baozi'))
    p2 = Process(target=producer, args=('alex', q, 'baozi'))
    p3 = Process(target=producer, args=('elen', q, 'baozi'))
    c1 = Process(target=consumer, args=('a', q))  # the consumers share q
    c2 = Process(target=consumer, args=('b', q))  # the consumers share q
    p1.start()
    p2.start()
    p3.start()
    c1.daemon = True  # daemon consumers end together with the producers and the main process:
    c2.daemon = True  # once each producer's q.join() returns, all data has been consumed
    c1.start()
    c2.start()
    p1.join()
    p2.join()
    p3.join()
2. Manager:
from multiprocessing import Manager, Process, Lock

def work(d, lock):
    with lock:  # the lock makes the shared read-modify-write atomic
        d['count'] -= 1

if __name__ == '__main__':
    m = Manager()
    d = m.dict({'count': 100})  # Manager creates a shared dictionary; 100 is just a starting value
    # d = m.list() would create a shared list instead
    lock = Lock()
    p_l = []
    for i in range(20):
        p = Process(target=work, args=(d, lock))
        p_l.append(p)
        p.start()
    for p in p_l:
        p.join()
    print(d)
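To see why the Lock matters: d['count'] -= 1 on a Manager dict is a read-modify-write across processes, not a single atomic operation. A minimal sketch of the race, with the lock removed (the counts and loop sizes here are illustrative, not from the original):

from multiprocessing import Manager, Process

def work_unsafe(d):
    for _ in range(100):
        d['count'] -= 1  # get, subtract, set: another process can interleave between the get and the set

if __name__ == '__main__':
    m = Manager()
    d = m.dict({'count': 2000})
    p_l = [Process(target=work_unsafe, args=(d,)) for _ in range(20)]
    for p in p_l:
        p.start()
    for p in p_l:
        p.join()
    print(d['count'])  # frequently ends above 0: decrements were lost without the lock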
3. Synchronous/Asynchronous --- Blocking/Non-blocking:
Synchronous call --- refers to how a task is submitted: after submitting a task (e.g. apply / submit), wait in place for the task to end before continuing.
Asynchronous call --- after a task is submitted, do not wait in place; immediately continue to submit the next task.
Blocking (a state of a process) --- the process hits IO and is deprived of its CPU execution permission.
Non-blocking --- the process is in the running state or the ready state.
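A small timing sketch of the difference, assuming a pool of 4 workers and 1 second of simulated IO per task (the timings in the comments are expected orders of magnitude, not measurements):

import time
from multiprocessing import Pool

def task(n):
    time.sleep(1)  # simulated IO: the worker blocks here
    return n * 2

if __name__ == '__main__':
    p = Pool(4)

    start = time.time()
    sync_res = [p.apply(task, args=(i,)) for i in range(4)]  # synchronous: waits per task, ~4s total
    print('sync:', sync_res, round(time.time() - start, 1), 's')

    start = time.time()
    objs = [p.apply_async(task, args=(i,)) for i in range(4)]  # asynchronous: all submitted at once
    async_res = [obj.get() for obj in objs]  # ~1s total: the 4 workers run in parallel
    print('async:', async_res, round(time.time() - start, 1), 's')

    p.close()
    p.join()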
4. Process Pool:
===================== Process Pool =====================
A pool does not give unbounded concurrency --- it cannot be used when massive concurrency is required, since only a fixed number of worker processes exist.

from multiprocessing import Pool  # the pool size controls how many worker processes are opened
import time, os, random

def work(n):
    print('%s is working' % os.getpid())
    time.sleep(random.randint(1, 3))  # simulated blocking time
    return n

if __name__ == '__main__':
    p = Pool(4)  # a pool of four processes: only four workers exist; a worker finishes one task, then takes another
    obj_ls = []
    for i in range(10):
        # res = p.apply(work, args=(i,))  # synchronous call: waits for the task to end and returns its result,
        # print(res)                      # like p = Process(target=work); p.start() followed by p.join()
        obj = p.apply_async(work, args=(i,))  # asynchronous call: keeps submitting tasks to the pool, does not fetch results
        obj_ls.append(obj)
        # print(obj.get())  # calling get() here would wait for each result and serialize everything
    p.close()  # stop accepting new apply_async submissions
    p.join()   # wait for the process pool to finish
    for obj in obj_ls:
        print(obj.get())
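For reference, the same pattern written with concurrent.futures.ProcessPoolExecutor, the newer standard-library pool interface (this equivalent is added here for comparison, it is not part of the original notes):

import os, time, random
from concurrent.futures import ProcessPoolExecutor

def work(n):
    print('%s is working' % os.getpid())
    time.sleep(random.randint(1, 3))
    return n

if __name__ == '__main__':
    with ProcessPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(work, i) for i in range(10)]  # asynchronous submission
        # leaving the with-block waits for all workers, like close() + join()
    for f in futures:
        print(f.result())  # like obj.get()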
5. Callback function:
import requests
import os
from multiprocessing import Pool

def get(url):
    print('%s get %s' % (os.getpid(), url))
    response = requests.get(url)
    if response.status_code == 200:
        return {'url': url, 'text': response.text}

def parse(data):
    print(os.getpid(), data)
    res = '%s:%s\n' % (data['url'], len(data['text']))
    with open('demo.txt', 'a') as f:
        f.write(res)

if __name__ == '__main__':
    urls = [
        'https://www.baidu.com',
        'https://www.hao123.com',
        'http://cn.bing.com/?mkt=zh-CN&mkt=zh-CN&mkt=zh-CN&mkt=zh-CN&mkt=zh-CN&mkt=zh-CN',
    ]
    p = Pool()
    url_ls = []
    for url in urls:
        # the main process is responsible for the callback:
        # the return value of get is passed to parse as its argument
        url_ls.append(p.apply_async(get, args=(url,), callback=parse))
    p.close()
    p.join()
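One caveat the example glosses over: if the worker function raises (for instance, requests.get fails), the callback is never invoked, and the exception only surfaces when .get() is called on the result object. apply_async also accepts an error_callback for this case; a brief sketch, with illustrative handler names:

from multiprocessing import Pool

def risky(n):
    if n == 2:
        raise ValueError('bad input: %s' % n)
    return n * n

def on_success(result):
    print('callback got', result)

def on_error(exc):
    print('error_callback got', exc)  # runs in the main process, like the success callback

if __name__ == '__main__':
    p = Pool(2)
    for i in range(4):
        p.apply_async(risky, args=(i,), callback=on_success, error_callback=on_error)
    p.close()
    p.join()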
Network programming basics --- concurrent programming --- Manager (shared dictionary, list) --- JoinableQueue --- process pool --- callback function