One, multi-threading
```python
import threading
from time import ctime, sleep

def music(func):
    for i in range(2):
        print("I was listening to %s. %s" % (func, ctime()))
        sleep(1)

def move(func):
    for i in range(2):
        print("I was at the %s! %s" % (func, ctime()))
        sleep(5)

threads = []
t1 = threading.Thread(target=music, args=(u'Love Business',))
threads.append(t1)
t2 = threading.Thread(target=move, args=(u'Avatar',))
threads.append(t2)

if __name__ == '__main__':
    for t in threads:
        t.daemon = True   # daemon threads die with the main program
        t.start()
    t.join()              # note: this only joins the last thread started
    print("All over %s" % ctime())
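One caveat with the example above: the `t.join()` sits outside the loop, so only the last thread is actually waited on. A minimal sketch (with made-up worker names, not from the original) of joining every thread before reading the results:

```python
import threading

results = [0] * 4

def square(i, n):
    # each worker writes to its own slot, so no lock is needed
    results[i] = n * n

threads = [threading.Thread(target=square, args=(i, n))
           for i, n in enumerate([1, 2, 3, 4])]
for t in threads:
    t.start()
for t in threads:
    t.join()   # join every thread, not just the last one
print(results)  # [1, 4, 9, 16]
```

Joining inside a second loop guarantees the main thread sees all four writes before printing.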
Two, thread pool (self-implemented)
```python
"""The thread-pool idea: there are 1000 jobs that would originally take
1000 people; now only 5 people do them all. Those 5 people are the thread
pool, and they keep running until the main program ends."""
from queue import Queue
from threading import Thread
import random
import time

def person(i, q):
    while True:  # this worker stays ready to pick up the next job
        q.get()
        print("Thread %s is doing the job" % i)
        time.sleep(random.randint(1, 5))  # workers run at different speeds, so each ends up with a different share of the jobs
        q.task_done()  # report the job just taken as finished

q = Queue()
for x in range(1000):  # hand out 1000 jobs
    q.put(x)

for i in range(5):  # call in 5 workers
    worker = Thread(target=person, args=(i, q))
    worker.daemon = True
    worker.start()

q.join()  # block until the 5 workers have finished all 1000 jobs
```
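Because the sleeps above make the demo slow, here is a compact, fast-running variant of the same pattern (names like `processed` are mine, not from the original): five daemon workers drain a shared `Queue`, and `q.join()` returns once every enqueued item has been marked done.

```python
import queue
import threading

q = queue.Queue()
processed = []
lock = threading.Lock()

def worker(i):
    while True:
        item = q.get()          # blocks until a job is available
        with lock:              # list.append is guarded for clarity
            processed.append(item)
        q.task_done()           # one task_done() per get()

for x in range(100):            # enqueue 100 jobs
    q.put(x)

for i in range(5):              # start the 5-worker "pool"
    t = threading.Thread(target=worker, args=(i,), daemon=True)
    t.start()

q.join()                        # returns when all 100 jobs are done
print(len(processed))           # 100
```

The `task_done()`/`join()` pair is the whole synchronization story: the queue keeps a counter of unfinished tasks, and `join()` unblocks when it hits zero.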
Three, thread pool (library implementation)
Look: just four lines of code get the job done, and three of them are boilerplate.
```python
import requests
from multiprocessing.dummy import Pool as ThreadPool

urls = [
    'http://www.baidu.com',
    'http://www.163.com',
    'http://www.sina.cn',
    'http://www.live.com',
    'http://www.mozila.org',
    'http://www.sohu.com',
    'http://www.tudou.com',
    'http://www.qq.com',
    'http://www.taobao.com',
    'http://www.alibaba.com',
]

# make the pool of workers
pool = ThreadPool(4)

# note the map function here!
# fetch the URLs in their own threads and return the results
results = pool.map(requests.get, urls)

# close the pool and wait for the work to finish
pool.close()
pool.join()
```
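The same map-over-a-thread-pool shape is also available via `concurrent.futures.ThreadPoolExecutor` in the standard library. A sketch of that alternative, using a stand-in function (`fetch_len`, my name) instead of `requests.get` so it runs without network access:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_len(s):
    # stand-in for requests.get in this sketch; just measures the URL
    return len(s)

urls = ['http://www.baidu.com', 'http://www.qq.com']

# the with-block replaces the explicit close()/join() pair
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(fetch_len, urls))
print(results)  # [20, 17]
```

`Executor.map` returns a lazy iterator rather than a list, which is why the result is wrapped in `list()` here.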
For comparison, the process-based `Pool` from `multiprocessing` offers the same interface:

```python
from multiprocessing import Pool

def f(x):
    return x * x

if __name__ == '__main__':
    with Pool(5) as p:
        print(p.map(f, [1, 2, 3]))  # [1, 4, 9]
```
Four, how to be more efficient (producer-consumer model)
This is much simpler, more efficient, and easier to understand than the classic approach, and there is no deadlock trap.
```python
from multiprocessing import Pool, Queue
import redis
import requests

queue = Queue(20)

def consumer():
    # one consumer pulls URLs from a redis list and feeds the queue
    r = redis.Redis(host='127.0.0.1', port=6379, db=1)
    while True:
        k, url = r.blpop(['pool'])
        queue.put(url)

def worker():
    # workers take URLs off the queue and fetch them
    while True:
        url = queue.get()
        print(requests.get(url).text)

def process(ptype):
    try:
        if ptype:
            consumer()
        else:
            worker()
    except:
        pass

pool = Pool(5)
# one consumer (ptype=1) and four workers (ptype=0)
print(pool.map(process, [1, 0, 0, 0, 0]))
pool.close()
pool.join()
```
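The example above needs a live redis server. A self-contained sketch of the same producer-consumer shape using only threads and a bounded `queue.Queue` (the `SENTINEL` shutdown trick and all names here are mine, not from the original):

```python
import queue
import threading

q = queue.Queue(maxsize=20)
SENTINEL = object()   # unique marker telling the consumer to stop
results = []

def producer():
    for n in range(10):
        q.put(n)       # blocks when the queue is full: natural back-pressure
    q.put(SENTINEL)    # signal end of work

def consumer():
    while True:
        item = q.get()
        if item is SENTINEL:
            break
        results.append(item * item)

p = threading.Thread(target=producer)
c = threading.Thread(target=consumer)
p.start(); c.start()
p.join(); c.join()
print(results)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

The bounded queue is what prevents the deadlock-prone hand-rolled locking of the "classic" condition-variable version: `put` and `get` do all the blocking for you.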