Notes only I can read: Python threads, processes, coroutines, I/O, synchronous and asynchronous
Here's an example:
I want to fetch three URLs. First, with an ordinary for loop:
import requests
import time
from multiprocessing import Process
from threading import Thread

# ----- ordinary serial, synchronous traversal -----
def get_page(url):
    page = requests.get(url)
    print(url)

start = time.time()
urls = ['http://jandan.net/', 'https://www.python.org', 'http://www.gamersky.com/']
for i in urls:
    get_page(i)
end = time.time()
print(end - start)
# This is the ordinary serial version.
This time I'll use the threading module to make it a bit faster.
def get_page(url):
    page = requests.get(url)

start = time.time()
threads = []  # list of Thread objects
urls = ['http://jandan.net/', 'http://www.xiaohuar.com/', 'http://www.gamersky.com/']
for i in urls:
    ok = Thread(target=get_page, args=(i,))  # target is the function, args the arguments passed to it
    threads.append(ok)
for i in threads:
    i.start()  # start each thread
for i in threads:
    i.join()   # the main thread waits for the child threads to finish
end = time.time()
print(end - start)
# Three threads were started to do the fetching.
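The same pattern also works with processes: the imports at the top already pull in Process from multiprocessing. Below is a sketch of that variant; to keep it runnable without network access, time.sleep(1) stands in for the real requests.get call, and the URLs are just labels.

```python
from multiprocessing import Process
import time

def get_page(url):
    time.sleep(1)  # stand-in for requests.get(url): simulates one slow network request

def fetch_all(urls):
    start = time.time()
    procs = [Process(target=get_page, args=(u,)) for u in urls]
    for p in procs:
        p.start()  # start all child processes
    for p in procs:
        p.join()   # the main process waits for the children to finish
    return time.time() - start

if __name__ == '__main__':
    urls = ['http://jandan.net/', 'https://www.python.org', 'http://www.gamersky.com/']
    print(fetch_all(urls))  # roughly 1 second instead of 3
```

Because the three 1-second "requests" overlap, the total time is about one second rather than three, just as with the threaded version.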
# If join() is called on a thread (the main thread blocks at the join call), the main thread waits for that child thread to finish executing.
# If a child thread is made a daemon thread (ok.daemon = True, or the legacy ok.setDaemon(True)), the main thread will not wait for it: when the main thread ends, daemon threads are terminated regardless of whether they have finished!
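The daemon behaviour can be observed directly by running a tiny script in a separate interpreter (a sketch; the 5-second sleep stands in for real work that never gets to finish):

```python
import subprocess
import sys
import textwrap

# Run a small script in its own interpreter so we can see what happens
# when the main thread exits while a daemon thread is still running.
script = textwrap.dedent("""
    from threading import Thread
    import time

    def worker():
        time.sleep(5)
        print("child finished")  # never reached: the daemon is killed with the main thread

    ok = Thread(target=worker)
    ok.daemon = True  # daemon thread: the main thread will not wait for it
    ok.start()
    print("main finished")
""")

result = subprocess.run([sys.executable, "-c", script],
                        capture_output=True, text=True, timeout=30)
print(result.stdout)
```

The script prints "main finished" and exits immediately; "child finished" never appears, because the daemon thread is terminated as soon as the main thread ends.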