How Python handles concurrency with futures

This article introduces how Python uses futures to handle concurrency, with worked examples for reference.

A first look at futures

The following scripts give a preliminary sense of what futures do:

Example 1: the ordinary way, with a sequential loop

import os
import sys
import time

import requests

pop20_cc = ('CN IN US ID BR PK NG BD RU JP MX PH VN ET EG DE IR TR CD FR').split()
base_url = 'http://flupy.org/data/flags'
dest_dir = 'downloads/'

def save_flag(img, filename):
    path = os.path.join(dest_dir, filename)
    with open(path, 'wb') as fp:
        fp.write(img)

def get_flag(cc):
    url = '{}/{cc}/{cc}.gif'.format(base_url, cc=cc.lower())
    resp = requests.get(url)
    return resp.content

def show(text):
    print(text, end=' ')
    sys.stdout.flush()

def download_many(cc_list):
    for cc in sorted(cc_list):      # download one flag at a time, sequentially
        image = get_flag(cc)
        show(cc)
        save_flag(image, cc.lower() + '.gif')
    return len(cc_list)

def main(download_many):
    t0 = time.time()
    count = download_many(pop20_cc)
    elapsed = time.time() - t0
    msg = '\n{} flags downloaded in {:.2f}s'
    print(msg.format(count, elapsed))

if __name__ == '__main__':
    main(download_many)

Example 2: the same downloads using futures, reusing part of the code above

from concurrent import futures

from flags import save_flag, get_flag, show, main

max_workers = 20

def download_one(cc):
    image = get_flag(cc)
    show(cc)
    save_flag(image, cc.lower() + '.gif')
    return cc

def download_many(cc_list):
    workers = min(max_workers, len(cc_list))
    with futures.ThreadPoolExecutor(workers) as executor:
        # map schedules download_one for each country code across the worker threads
        res = executor.map(download_one, sorted(cc_list))
    return len(list(res))

if __name__ == '__main__':
    main(download_many)

Running each version three times, the average elapsed times were 13.67s and 1.59s respectively; the difference is considerable.

Future

Futures are an essential component of both the concurrent.futures module and the asyncio module.

Starting with Python 3.4 there are two classes named Future in the standard library: concurrent.futures.Future and asyncio.Future.
These two classes serve the same purpose: an instance of either Future class represents a deferred computation that may or may not have completed. This is similar to the Deferred class in Twisted and the Future class in the Tornado framework.

Note: normally you should not create futures yourself; they are meant to be instantiated by the concurrency framework (concurrent.futures or asyncio).

Reason: a future represents something that will eventually happen, and the only way to be sure it will happen is to schedule its execution. Therefore a concurrent.futures.Future instance is created only when work is handed to a concurrent.futures.Executor subclass for processing.
For example, the argument to Executor.submit() is a callable; submit() schedules that callable to run and returns a Future.
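A minimal sketch of that behavior follows; the slow_add task and the one-second sleep are invented placeholders for illustration, not part of the flags example:

from concurrent import futures
import time

def slow_add(a, b):
    time.sleep(1)                            # stand-in for real work
    return a + b

with futures.ThreadPoolExecutor(max_workers=1) as executor:
    fut = executor.submit(slow_add, 2, 3)    # schedules slow_add and immediately returns a Future
    print(fut)                               # something like <Future at 0x... state=running>
    print(fut.result())                      # blocks until the result is ready, then prints 5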

Client code should not change the state of a future. The concurrency framework changes a future's state when the deferred computation it represents finishes, and we cannot control when that computation ends.

Both Future classes have a .done() method, which does not block and returns a Boolean telling whether the callable linked to the future has finished executing. Client code usually does not poll whether a future is done; instead it waits for notification. That is why both Future classes have an .add_done_callback() method: it takes a single argument, a callable, and that callable is invoked when the future finishes running.
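A small sketch of .done() and .add_done_callback() with concurrent.futures; the work function and its argument are made up for illustration:

from concurrent import futures
import time

def work(n):
    time.sleep(n)
    return n * 10

def report(fut):
    # invoked by the framework once the future has finished
    print('done?', fut.done(), '-> result:', fut.result())

with futures.ThreadPoolExecutor(max_workers=1) as executor:
    fut = executor.submit(work, 1)
    print('done right after submit?', fut.done())   # almost certainly False
    fut.add_done_callback(report)                   # report(fut) runs when work finishes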

The .result() method behaves the same in both classes once the future is done: it returns the result of the callable, or re-raises any exception raised while the callable was executing. However, if the future has not finished running, .result() behaves differently in the two Future classes.

For a concurrent.futures.Future instance, calling .result() blocks the caller's thread until a result is available. The result method accepts an optional timeout argument; if the future does not finish within the specified time, a TimeoutError is raised.
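A short sketch of the timeout behavior, assuming a deliberately slow task (slow is an invented name):

from concurrent import futures
import time

def slow():
    time.sleep(5)
    return 'finished'

with futures.ThreadPoolExecutor(max_workers=1) as executor:
    fut = executor.submit(slow)
    try:
        print(fut.result(timeout=1))         # wait at most one second for the result
    except futures.TimeoutError:
        print('not done within 1s, gave up waiting')

Note that leaving the with block still waits for the worker to finish, because the executor's shutdown waits for pending work by default.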

The asyncio.Future.result() method, on the other hand, does not support a timeout. In asyncio the preferred way to obtain the result of a future is the yield from structure, which concurrent.futures.Future does not support.
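For comparison, a minimal asyncio sketch; in modern Python the await keyword plays the role of the yield from structure mentioned above, and this example assumes Python 3.7+ for asyncio.run (compute is an invented coroutine):

import asyncio

async def compute():
    await asyncio.sleep(1)                    # simulate asynchronous work
    return 42

async def main():
    fut = asyncio.ensure_future(compute())    # wraps the coroutine in an asyncio future (a Task)
    result = await fut                        # the modern equivalent of "yield from fut"
    print(result)                             # 42

asyncio.run(main())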

Both asyncio and concurrent.futures provide several functions that return futures, while other functions consume them. In the threaded example above we used executor.map, which consumes futures internally: it returns an iterator whose __next__ method calls the .result() method of each future, so what we get back are the results of the futures, not the futures themselves.
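A tiny sketch of that iterator behavior; square and the input list are made up for illustration:

from concurrent import futures
import time

def square(n):
    time.sleep(0.1)
    return n * n

with futures.ThreadPoolExecutor(max_workers=3) as executor:
    res = executor.map(square, [3, 1, 2])    # returns an iterator, not a list of results
    for value in res:                        # each iteration calls .result() on the next future
        print(value)                         # 9, 1, 4 -- always in the order submitted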

To demonstrate the futures.as_completed function, the following version uses two loops: one to create and schedule the futures, and another to retrieve their results.

from concurrent import futures

from flags import save_flag, get_flag, show, main

max_workers = 20

def download_one(cc):
    image = get_flag(cc)
    show(cc)
    save_flag(image, cc.lower() + '.gif')
    return cc

def download_many(cc_list):
    cc_list = cc_list[:5]               # only the first five, to keep the output readable
    with futures.ThreadPoolExecutor(max_workers=3) as executor:
        to_do = []
        # first loop: create and schedule the futures
        for cc in sorted(cc_list):
            future = executor.submit(download_one, cc)
            to_do.append(future)
            msg = 'Scheduled for {}: {}'
            print(msg.format(cc, future))
        results = []
        # second loop: retrieve results as the futures complete
        for future in futures.as_completed(to_do):
            res = future.result()
            msg = '{} result: {!r}'
            print(msg.format(future, res))
            results.append(res)
    return len(results)

if __name__ == '__main__':
    main(download_many)

When run, the output first lists the scheduled futures, then prints each future and its result as it completes; the completion order is not guaranteed.

Note: Python code cannot control the GIL directly, but all standard-library functions that perform blocking I/O release the GIL while waiting for the operating system to return a result, allowing other threads to run. That is why Python threads are useful in I/O-bound applications.

The examples above use concurrent.futures to launch threads; the next section covers launching processes with it.

Launching processes with concurrent.futures

The ProcessPoolExecutor class in concurrent.futures distributes work across multiple Python processes, so for CPU-bound processing this module can bypass the GIL and use all available CPU cores.

The idea is that a ProcessPoolExecutor creates N independent Python interpreters, where N is the number of CPU cores available on the system.

It is used in the same way as ThreadPoolExecutor.
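As a sketch, a CPU-bound variant might look like this; cpu_heavy and run_many are illustrative names, not part of the flags code:

from concurrent import futures

def cpu_heavy(n):
    # CPU-bound work: sum of squares below n
    return sum(i * i for i in range(n))

def run_many(numbers):
    # with no max_workers argument, ProcessPoolExecutor uses os.cpu_count() workers
    with futures.ProcessPoolExecutor() as executor:
        return list(executor.map(cpu_heavy, numbers))

if __name__ == '__main__':     # required because worker processes import this module
    print(run_many([10_000, 20_000, 30_000]))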
