Python concurrent programming: process pools and thread pools with concurrent.futures


Process pool and thread pool

When we first learn multi-processing or multi-threading, we are eager to implement concurrent socket communication by spawning one process or thread per client. The fatal flaw of this approach is that the number of processes or threads the service opens grows with the number of concurrent clients.

This puts great pressure on the server host and can even overwhelm it. We therefore need to cap the number of processes or threads the server creates, so that the machine runs within a load it can handle. That is the purpose of a process pool or thread pool.

A process pool, for example, is a pool that holds processes: it is essentially still multi-processing, but with a limit on how many processes may be open at once.

Python: concurrent.futures
1. The concurrent.futures module is used to launch parallel tasks; it provides a higher-level interface for executing calls asynchronously.
2. The module is very convenient to use, and its interface is packaged very simply.
3. It implements both process pools and thread pools.
4. Importing the pool classes:
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor
p = ProcessPoolExecutor(max_workers)  # process pool; if max_workers is omitted, it defaults to the number of CPUs
p = ThreadPoolExecutor(max_workers)   # thread pool; if max_workers is omitted, the default is derived from the CPU count (os.cpu_count() * 5 before Python 3.8; min(32, os.cpu_count() + 4) since)
Basic methods
1. submit(fn, *args, **kwargs): submits a task asynchronously.
2. map(func, *iterables, timeout=None, chunksize=1): replaces a for loop of submit calls.
3. shutdown(wait=True): equivalent to the process pool's pool.close() + pool.join(). With wait=True it waits until all tasks in the pool have completed and resources are reclaimed before continuing; with wait=False it returns immediately without waiting for the tasks in the pool to finish. Regardless of the wait value, the whole program will not exit until all tasks have executed. submit and map must be called before shutdown.
4. result(timeout=None): fetches a task's result.
5. add_done_callback(fn): attaches a callback function to a future.
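Both executors can also be used as context managers, which calls shutdown(wait=True) automatically when the with-block exits. A minimal sketch of the basic methods above (the double task is hypothetical, for illustration only):

```python
from concurrent.futures import ThreadPoolExecutor

def double(n):
    # Trivial stand-in task (hypothetical, for illustration only).
    return n * 2

# Using the executor as a context manager calls shutdown(wait=True)
# automatically when the with-block exits.
with ThreadPoolExecutor(max_workers=4) as executor:
    futures = [executor.submit(double, i) for i in range(5)]
    results = [f.result() for f in futures]  # result() blocks until each task is done

print(results)  # [0, 2, 4, 6, 8]
```

This is the same submit/result/shutdown pattern used in the examples below, just with the shutdown call handled by the with statement.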


Process Pool
from concurrent.futures import ProcessPoolExecutor
import os
import time
import random

def task(n):
    print("%s is running" % os.getpid())
    time.sleep(random.randint(1, 3))
    return n * 2

if __name__ == '__main__':
    start = time.time()
    executor = ProcessPoolExecutor(4)
    res = []
    for i in range(10):  # submit 10 tasks
        future = executor.submit(task, i)  # asynchronous submit
        res.append(future)
    executor.shutdown()  # wait for every task in the pool to finish
    print("++++>")
    for r in res:
        print(r.result())  # print the results
    end = time.time()
    print(end - start)

--------------------- output ---------------------
2464 is running
9356 is running
10780 is running
9180 is running
2464 is running
10780 is running
9180 is running
9356 is running
10780 is running
9180 is running
++++>
0
2
4
6
8
10
12
14
16
18
6.643380165100098
Thread pool
from concurrent.futures import ThreadPoolExecutor
from threading import currentThread
import time
import random

def task(n):
    print("%s is running" % currentThread().getName())
    time.sleep(random.randint(1, 3))
    return n * 2

if __name__ == '__main__':
    start = time.time()
    executor = ThreadPoolExecutor(4)  # thread pool
    res = []
    for i in range(10):  # submit 10 tasks
        future = executor.submit(task, i)  # asynchronous submit
        res.append(future)
    executor.shutdown()  # wait for all threads to finish
    print("++++>")
    for r in res:
        print(r.result())  # print the results
    end = time.time()
    print(end - start)

--------------------- output ---------------------
<concurrent.futures.thread.ThreadPoolExecutor object at 0x00000000025B0DA0>_0 is running
<concurrent.futures.thread.ThreadPoolExecutor object at 0x00000000025B0DA0>_1 is running
<concurrent.futures.thread.ThreadPoolExecutor object at 0x00000000025B0DA0>_2 is running
<concurrent.futures.thread.ThreadPoolExecutor object at 0x00000000025B0DA0>_3 is running
<concurrent.futures.thread.ThreadPoolExecutor object at 0x00000000025B0DA0>_3 is running
<concurrent.futures.thread.ThreadPoolExecutor object at 0x00000000025B0DA0>_1 is running
<concurrent.futures.thread.ThreadPoolExecutor object at 0x00000000025B0DA0>_0 is running
<concurrent.futures.thread.ThreadPoolExecutor object at 0x00000000025B0DA0>_2 is running
<concurrent.futures.thread.ThreadPoolExecutor object at 0x00000000025B0DA0>_3 is running
<concurrent.futures.thread.ThreadPoolExecutor object at 0x00000000025B0DA0>_1 is running
++++>
0
2
4
6
8
10
12
14
16
18
5.002286195755005
callback function
import requests
import time
from concurrent.futures import ThreadPoolExecutor

def get(url):
    print('GET {}'.format(url))
    response = requests.get(url)
    time.sleep(2)
    if response.status_code == 200:  # 200 stands for status: download succeeded
        return {'url': url, 'content': response.text}

def parse(res):
    print('%s parse res is %s' % (res['url'], len(res['content'])))
    return '%s parse res is %s' % (res['url'], len(res['content']))

def save(res):
    print('save', res)

def task(future):
    # The callback receives a Future object, not the return value;
    # call .result() on it to get the value returned by get().
    res = future.result()
    par_res = parse(res)
    save(par_res)

if __name__ == '__main__':
    urls = [
        'http://www.cnblogs.com/linhaifeng',
        'https://www.python.org',
        'https://www.openstack.org',
    ]
    pool = ThreadPoolExecutor(2)
    for url in urls:
        # Whichever task finishes first triggers its callback first.
        # Callbacks are a general programming idiom, not limited to thread pools.
        pool.submit(get, url).add_done_callback(task)
    pool.shutdown()  # equivalent to close + join of a process pool

--------------------- output ---------------------
GET http://www.cnblogs.com/linhaifeng
GET https://www.python.org
http://www.cnblogs.com/linhaifeng parse res is 17426
save http://www.cnblogs.com/linhaifeng parse res is 17426
GET https://www.openstack.org
https://www.python.org parse res is 48809
save https://www.python.org parse res is 48809
https://www.openstack.org parse res is 60632
save https://www.openstack.org parse res is 60632
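An alternative to add_done_callback for the "whichever finishes first gets handled first" behaviour is concurrent.futures.as_completed, which yields futures in completion order. A sketch using a local stand-in task instead of a real download (the work function is hypothetical):

```python
import time
import random
from concurrent.futures import ThreadPoolExecutor, as_completed

def work(n):
    # Hypothetical stand-in for a download: a random sleep makes
    # tasks finish out of submission order.
    time.sleep(random.uniform(0, 0.2))
    return n * n

with ThreadPoolExecutor(max_workers=3) as pool:
    futures = [pool.submit(work, n) for n in range(5)]
    collected = []
    for fut in as_completed(futures):  # yields each future as soon as it finishes
        collected.append(fut.result())

print(sorted(collected))  # completion order varies; the set of results does not
```

Compared with callbacks, as_completed keeps the result handling in the main thread, which can be easier to reason about.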

Map
import requests
import time
from concurrent.futures import ThreadPoolExecutor

def get(url):
    print('GET {}'.format(url))
    response = requests.get(url)
    time.sleep(2)
    if response.status_code == 200:  # 200 stands for status: download succeeded
        return {'url': url, 'content_len': len(response.text)}

if __name__ == '__main__':
    urls = [
        'http://www.cnblogs.com/linhaifeng',
        'https://www.python.org',
        'https://www.openstack.org',
    ]
    pool = ThreadPoolExecutor(2)
    res = pool.map(get, urls)  # map replaces the for + submit loop
    pool.shutdown()  # equivalent to close + join of a process pool
    print('=' * 20)
    for r in res:  # map returns an iterator
        print(r)

--------------------- output ---------------------
GET http://www.cnblogs.com/linhaifeng
GET https://www.python.org
GET https://www.openstack.org
{'url': 'http://www.cnblogs.com/linhaifeng', 'content_len': 17426}
{'url': 'https://www.python.org', 'content_len': 48809}
{'url': 'https://www.openstack.org', 'content_len': 60632}
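Note that map returns results in the order of the input iterable, even when later items finish first, which is exactly why the dicts above come back in the same order as the urls list. A small sketch demonstrating this with a local task (slow_double is hypothetical; earlier items deliberately sleep longer):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def slow_double(n):
    # Hypothetical task: earlier items sleep longer, so they
    # *finish* later than the items submitted after them.
    time.sleep((5 - n) * 0.05)
    return n * 2

with ThreadPoolExecutor(max_workers=5) as pool:
    # map still hands results back in input order, not completion order
    results = list(pool.map(slow_double, range(5)))

print(results)  # [0, 2, 4, 6, 8]
```

If completion order matters more than input order, prefer as_completed or add_done_callback instead of map.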


