10 - Threads, processes, coroutines, IO multiplexing

Source: Internet
Author: User
Tags: epoll, string format

-Threads and processes: introduction
1. The smallest unit of work is a thread.
2. An application has at least one process, and a process has at least one thread.
3. Application scenarios:
   IO-intensive work: threads
   Compute-intensive work: processes
4. GIL, the Global Interpreter Lock.
-Ensures that only one thread in the same process executes Python bytecode at any moment (a timing sketch follows this list).
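To make point 3 and the GIL concrete, here is a minimal timing sketch, assuming four workers and a pure-CPU countdown loop; the helper names count_down and timed and the workload size are illustrative, not from the original.

import time
from threading import Thread
from multiprocessing import Process

def count_down(n):
    # Pure CPU work: the GIL lets only one thread run this at a time
    while n > 0:
        n -= 1

def timed(worker_cls, label):
    workers = [worker_cls(target=count_down, args=(10_000_000,)) for _ in range(4)]
    start = time.time()
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print(label, round(time.time() - start, 2), 'seconds')

if __name__ == '__main__':
    timed(Thread, 'threads:')      # serialized by the GIL
    timed(Process, 'processes:')   # runs on multiple cores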
-Thread
1. Basic use
import time
import threading

def task(arg):
    time.sleep(arg)
    print(arg)

for i in range(5):
    t = threading.Thread(target=task, args=[i])
    # t.setDaemon(True)   # daemon thread: the main thread exits without waiting for it
    # t.setDaemon(False)
    t.start()
    # t.join()            # wait until this thread finishes
    # t.join(1)           # wait at most 1 second
  2. Lock
# 1. Only one holder at a time
#    lock = threading.Lock()              # can be acquired only once before release
#    lock = threading.RLock()             # re-entrant: the same thread may acquire it repeatedly
# 2. Several holders at the same time
#    lock = threading.BoundedSemaphore(3) # at most 3 threads inside at once
# 3. Release all waiting threads at once
#    lock = threading.Event()
# 4. Release an arbitrary number of waiting threads
#    lock = threading.Condition()
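A runnable sketch of the most common case, threading.Lock guarding shared state; the counter variable, the helper add_many, and the thread count are illustrative assumptions, not from the original.

import threading

counter = 0
lock = threading.Lock()

def add_many(n):
    global counter
    for _ in range(n):
        with lock:              # only one thread mutates the counter at a time
            counter += 1

threads = [threading.Thread(target=add_many, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)                  # always 400000 with the lock in place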
3. Thread pool
Mode one: Direct processing
import requests
from concurrent.futures import ThreadPoolExecutor

def task(url):
    """The task does two things: download, then save locally."""
    # All data of the HTTP response is wrapped in the response object:
    #   - response.url          the requested URL
    #   - response.status_code  the response status code
    #   - response.text         the response body (string)
    #   - response.content      the response body (bytes)
    # Download
    response = requests.get(url)
    # Save the downloaded content locally
    f = open('a.log', 'wb')
    f.write(response.content)
    f.close()

pool = ThreadPoolExecutor(2)
url_list = [
    'http://www.oldboyedu.com',
    'http://www.autohome.com.cn',
    'http://www.baidu.com',
]
for url in url_list:
    print('Start request', url)
    # Hand the task to the thread pool
    pool.submit(task, url)
Mode two: processing in steps (download, then a callback saves)
import requests
from concurrent.futures import ThreadPoolExecutor

def save(future):
    """Only saves; the future wraps the response returned by task()."""
    response = future.result()
    # Save the downloaded content locally
    f = open('a.log', 'wb')
    f.write(response.content)
    f.close()

def task(url):
    """Only downloads."""
    # All data of the HTTP response is wrapped in the response object:
    #   - response.url          the requested URL
    #   - response.status_code  the response status code
    #   - response.text         the response body (string)
    #   - response.content      the response body (bytes)
    response = requests.get(url)
    return response

pool = ThreadPoolExecutor(2)
url_list = [
    'http://www.oldboyedu.com',
    'http://www.autohome.com.cn',
    'http://www.baidu.com',
]
for url in url_list:
    print('Start request', url)
    # The future returned by submit() wraps the response
    future = pool.submit(task, url)
    # Once the download finishes, save() is called automatically
    future.add_done_callback(save)
-Process
1. Basic use
import time
from multiprocessing import Process

def task(arg):
    time.sleep(arg)
    print(arg)

if __name__ == '__main__':
    for i in range(5):
        p = Process(target=task, args=(i,))
        p.daemon = True      # daemon process: the main process does not wait for it
        # p.daemon = False
        p.start()
        p.join(1)            # wait at most 1 second
    print('main process last ...')
Process locks are used exactly the same way as thread locks.
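A minimal sketch of that claim, assuming a shared multiprocessing.Value counter; the names add_many and counter are illustrative, not from the original.

from multiprocessing import Process, Lock, Value

def add_many(lock, counter, n):
    for _ in range(n):
        with lock:                 # same with-statement usage as threading.Lock
            counter.value += 1

if __name__ == '__main__':
    lock = Lock()
    counter = Value('i', 0)        # shared integer
    procs = [Process(target=add_many, args=(lock, counter, 10_000)) for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(counter.value)           # 40000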
2. Data sharing between processes
Sharing requires something special:
- Array('typecode', length)
- Manager().list() / Manager().dict()
- The Manager error happens because the main process finishes first; the child process then fails when it touches the shared data.
- Adding an input() (or anything else that keeps the main process alive) prevents the error.
- Or call p.join(), which makes execution serial and also avoids it.
- Using a process pool avoids the problem as well.
- Third-party tools can also be used.
from multiprocessing import Process, Array, Manager
from threading import Thread

# 1. Verify that data is NOT shared between processes
"""
def task(num, li):
    li.append(num)
    print(li)   # with Process, each child prints only its own number: [0] [1] ...

if __name__ == '__main__':
    v = []
    for i in range(10):
        # p = Process(target=task, args=(i, v))
        p = Thread(target=task, args=(i, v))   # with Thread, the list is shared
        p.start()
"""

# 2. Way one: share data through Array
"""
def task(num, li):
    li[num] = 1
    print(list(li))

if __name__ == '__main__':
    v = Array('i', 10)      # fixed-length array of ints
    for i in range(10):
        p = Process(target=task, args=(i, v))
        p.start()
"""

# 3. Way two: share data through Manager().list() / Manager().dict()
def task(num, li):
    li.append(num)
    print(li)

if __name__ == '__main__':
    v = Manager().list()
    # v = Manager().dict()
    for i in range(10):
        p = Process(target=task, args=(i, v))
        p.start()
        # p.join()
    input('>>> ')           # keep the main process alive so the children can finish
3. Process Pool
from concurrent.futures import ProcessPoolExecutor

def call(arg):
    data = arg.result()
    print(data)

def task(arg):
    print(arg)
    return arg + 100     # illustrative constant

if __name__ == '__main__':
    pool = ProcessPoolExecutor(5)
    for i in range(10):
        obj = pool.submit(task, i)
        obj.add_done_callback(call)

================== Conclusion ==================
IO-intensive: threads
Compute-intensive: processes
-Coroutines
pip3 install greenlet
A coroutine always runs inside a single thread; it slices that one thread's execution into pieces and switches between them.
from greenlet import greenlet

def test1():
    print(12)
    gr2.switch()
    print(34)
    gr2.switch()

def test2():
    print(56)
    gr1.switch()
    print(78)

gr1 = greenlet(test1)
gr2 = greenlet(test2)
gr1.switch()

Second-stage processing (make coroutines switch when IO would block):
Custom:
    write your own, based on select
Off-the-shelf:
    pip3 install gevent
from gevent import monkey; monkey.patch_all()   # patch blocking IO so gevent can switch on it
import gevent
import requests

def f(url):
    response = requests.get(url)
    print(response.url, response.status_code)

gevent.joinall([
    gevent.spawn(f, 'http://www.oldboyedu.com/'),
    gevent.spawn(f, 'http://www.baidu.com/'),
    gevent.spawn(f, 'http://github.com/'),
])
-IO multiplexing
Monitor multiple socket objects for changes (readable, writable, or in error).
-Example One:
import socket
import select

# IO multiplexing: one process serves ports 8001 and 8002
# "Pseudo" concurrency on the server side, based on select

sk1 = socket.socket()
sk1.bind(('127.0.0.1', 8001))
sk1.listen(5)

sk2 = socket.socket()
sk2.bind(('127.0.0.1', 8002))
sk2.listen(5)

inputs = [sk1, sk2]
w_inputs = []

while True:
    # IO multiplexing: watch several socket objects at once
    #   - select: actively polls, at most 1024 fds
    #   - poll:   actively polls, no fd limit
    #   - epoll:  passively notified by the kernel
    r, w, e = select.select(inputs, w_inputs, inputs, 0.05)
    # r: readable  w: writable  e: in error

    for obj in r:
        if obj in [sk1, sk2]:
            # A new connection arrived on a listening socket
            print('New connection:', obj)
            conn, addr = obj.accept()
            inputs.append(conn)
        else:
            # An already-connected client sent data
            print('A client sent data:', obj)
            try:
                data = obj.recv(1024)
            except Exception:
                data = ''
            if data:
                w_inputs.append(obj)
                # obj.sendall(data)
            else:
                # The client disconnected
                obj.close()
                inputs.remove(obj)
                if obj in w_inputs:
                    w_inputs.remove(obj)

    for obj in w:
        obj.sendall(b'OK')
        w_inputs.remove(obj)

# For comparison, a plain blocking server handles one client at a time:
# sk1 = socket.socket()
# sk1.bind(('127.0.0.1', 8001))
# sk1.listen(5)
# while True:
#     conn, addr = sk1.accept()   # conn is the per-client socket object
#     conn.recv(1024)
#     conn.sendall(b'OK')


-socketserver = IO multiplexing + threads
    - IO multiplexing
    - Threads
import socket
import select
import threading

# IO multiplexing + a thread per connection: the idea behind socketserver

"""
def process_request(conn):
    while True:
        v = conn.recv(1024)
        conn.sendall(b'1111')

sk1 = socket.socket()
sk1.bind(('127.0.0.1', 8001))
sk1.listen(5)

inputs = [sk1]
while True:
    # IO multiplexing: watch several socket objects at once
    #   - select: actively polls, at most 1024 fds
    #   - poll:   actively polls, no fd limit
    #   - epoll:  passively notified by the kernel
    r, w, e = select.select(inputs, [], inputs, 0.05)
    for obj in r:
        if obj is sk1:
            # conn is the client socket; hand it to a new thread
            conn, addr = obj.accept()
            t = threading.Thread(target=process_request, args=(conn,))
            t.start()
"""

# The same idea with the standard library:
# import socketserver
#
# class MyHandler(socketserver.BaseRequestHandler):
#     def handle(self):
#         pass
#
# server = socketserver.ThreadingTCPServer(('127.0.0.1', 8001), MyHandler)
# server.serve_forever()

-Custom asynchronous non-blocking framework
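A minimal sketch of the idea, assuming non-blocking sockets driven by a select loop; the class AsyncRequest, the callback show(), and the target hosts are illustrative assumptions, not from the source.

import socket
import select

class AsyncRequest:
    def __init__(self, host, callback):
        self.host = host
        self.callback = callback
        self.sent = False
        self.response = b''
        self.sock = socket.socket()
        self.sock.setblocking(False)          # non-blocking: connect/send/recv never wait

    def fileno(self):                         # lets select() watch this object directly
        return self.sock.fileno()

def run(request_list):
    pending = list(request_list)
    for req in pending:
        try:
            req.sock.connect((req.host, 80))  # returns immediately in non-blocking mode
        except BlockingIOError:
            pass

    while pending:
        # writable: the connection is established, the request can be sent
        # readable: response bytes (or EOF) have arrived
        readable, writable, _ = select.select(pending, pending, [], 0.05)
        for req in writable:
            if not req.sent:
                msg = 'GET / HTTP/1.0\r\nHost: %s\r\n\r\n' % req.host
                req.sock.sendall(msg.encode())
                req.sent = True
        for req in readable:
            chunk = req.sock.recv(8192)
            if chunk:
                req.response += chunk
            else:                             # server closed the connection: done
                req.callback(req.host, req.response)
                req.sock.close()
                pending.remove(req)

def show(host, data):
    print(host, len(data), 'bytes')

run([AsyncRequest('www.baidu.com', show), AsyncRequest('github.com', show)])

This is the same pattern gevent automates: the select loop decides which request can be written or read next, so a single thread serves many sockets without ever blocking on one of them.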
