Python Coroutines

Python coroutines: from yield/send to async/await

Reposted from http://python.jobbole.com/86069/

Because of the well-known GIL, Python threads cannot exploit multi-core parallelism (although the later multiprocessing module does provide process-level parallelism), which makes them look rather underwhelming. Since only one thread runs at a time under the GIL, the switching overhead between threads is a drag for CPU-bound programs; I/O-bound programs, on the other hand, are exactly what coroutines excel at:

Multitasking concurrency (not parallelism), where each task suspends at the right time (when it starts an I/O operation) and resumes at the right time (when the I/O completes).

Coroutines in Python have a long history. Roughly speaking, they went through three stages:

  1. Early generator-based coroutines built on yield/send
  2. The introduction of @asyncio.coroutine and yield from
  3. The async/await keywords introduced in Python 3.5
Starting with yield

Let's start with an ordinary piece of code that computes the Fibonacci sequence:

def old_fib(n):
    res = [0] * n
    index = 0
    a = 0
    b = 1
    while index < n:
        res[index] = b
        a, b = b, a + b
        index += 1
    return res

print('-' * 10 + ' test old fib ' + '-' * 10)
for fib_res in old_fib(20):
    print(fib_res)

If we only need the nth Fibonacci number, or only want to walk through the sequence one value at a time, this traditional approach wastes memory.

At this point, yield comes in handy.

def fib(n):
    index = 0
    a = 0
    b = 1
    while index < n:
        yield b
        a, b = b, a + b
        index += 1

print('-' * 10 + ' test yield fib ' + '-' * 10)
for fib_res in fib(20):
    print(fib_res)

When a function contains a yield statement, Python automatically treats it as a generator. Calling fib(20) does not actually execute the function body; it just creates a generator object.

Here, yield preserves the execution state of fib, suspends the computation, and hands back b. When fib(20) is placed in a for...in loop, each iteration calls next() on the generator, waking it up and running it to the next yield statement, until a StopIteration exception is raised. The for loop catches this exception, and the loop ends.
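To make that mechanism concrete, here is a small sketch of what the for loop is doing behind the scenes; it assumes the fib generator defined above and drives it by hand with next():

gen = fib(5)
while True:
    try:
        print(next(gen))      # resume the generator and run it to the next yield
    except StopIteration:     # raised once the generator function returns
        break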

Here comes send

As you can see from the program above, data currently only flows out of fib(20) to the enclosing for loop. If we could also send data into fib(20), we would have the makings of a coroutine in Python.

For exactly this reason, Python generators have a send() method, and a yield expression evaluates to the value that was sent in.

Let's use this feature to simulate a slow computation of the Fibonacci sequence:

import time
import random

def stupid_fib(n):
    index = 0
    a = 0
    b = 1
    while index < n:
        sleep_cnt = yield b
        print('let me think {0} secs'.format(sleep_cnt))
        time.sleep(sleep_cnt)
        a, b = b, a + b
        index += 1

print('-' * 10 + ' test yield send ' + '-' * 10)
N = 20
sfib = stupid_fib(N)
fib_res = next(sfib)
while True:
    print(fib_res)
    try:
        fib_res = sfib.send(random.uniform(0, 0.5))
    except StopIteration:
        break

Here next(sfib) is equivalent to sfib.send(None); it runs sfib up to the first yield and returns the value there. Each subsequent sfib.send(random.uniform(0, 0.5)) sends a random number of seconds into sfib, and that number becomes the value of the yield expression at the current suspension point. In this way the "main" program controls how long the coroutine spends computing the Fibonacci sequence, and the coroutine hands its results back to the "main" program. Perfect!
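As a quick, throwaway check of the claim that next(g) and g.send(None) are interchangeable (the gen function below is only an illustrative example, not part of the original article):

def gen():
    x = yield 1
    yield x

g = gen()
print(g.send(None))   # same as next(g): runs to the first yield and returns 1
print(g.send('hi'))   # 'hi' becomes the value of the first yield expression; prints 'hi'

g2 = gen()
try:
    g2.send('hi')     # sending a non-None value before the generator has started
except TypeError as e:
    print(e)          # "can't send non-None value to a just-started generator"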

What on earth is yield from?

yield from is used for refactoring generators. In its simplest form, it can be used like this:

def copy_fib(n):
    print('I am copy from fib')
    yield from fib(n)
    print('Copy end')

print('-' * 10 + ' test yield from ' + '-' * 10)
for fib_res in copy_fib(20):
    print(fib_res)

That is the simplest usage, but it is far from the whole story of yield from. yield from also acts as a conduit: it forwards send() messages to the inner generator and handles all kinds of exceptions along the way, so stupid_fib can be wrapped and used in the same way:

def copy_stupid_fib(n):
    print('I am copy from stupid fib')
    yield from stupid_fib(n)
    print('Copy end')

print('-' * 10 + ' test yield from and send ' + '-' * 10)
N = 20
csfib = copy_stupid_fib(N)
fib_res = next(csfib)
while True:
    print(fib_res)
    try:
        fib_res = csfib.send(random.uniform(0, 0.5))
    except StopIteration:
        break

Without yield from, copy_stupid_fib here would be considerably more complex, because you would have to handle all the forwarding and exceptions yourself.
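To see why, here is a rough sketch of what delegating to stupid_fib by hand might look like. It is a simplified version of the expansion described in PEP 380, not the exact semantics; manual_copy_stupid_fib is an illustrative name, and the code assumes the stupid_fib defined earlier:

def manual_copy_stupid_fib(n):
    print('I am copy from stupid fib')
    inner = stupid_fib(n)
    try:
        value = next(inner)               # prime the inner generator
    except StopIteration:
        pass
    else:
        while True:
            try:
                sent = yield value        # pass the inner value out, receive send() from the caller
            except BaseException as exc:  # forward throw()/close() to the inner generator
                try:
                    value = inner.throw(exc)
                except StopIteration:
                    break
            else:
                try:
                    value = inner.send(sent)  # pass the sent value back in
                except StopIteration:
                    break
    print('Copy end')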

asyncio.coroutine and yield from

It is in the asyncio module that yield from really flourishes. Look at the sample code first:

import asyncio
import random

@asyncio.coroutine
def smart_fib(n):
    index = 0
    a = 0
    b = 1
    while index < n:
        sleep_secs = random.uniform(0, 0.2)
        yield from asyncio.sleep(sleep_secs)
        print('Smart one think {} secs to get {}'.format(sleep_secs, b))
        a, b = b, a + b
        index += 1

@asyncio.coroutine
def stupid_fib(n):
    index = 0
    a = 0
    b = 1
    while index < n:
        sleep_secs = random.uniform(0, 0.4)
        yield from asyncio.sleep(sleep_secs)
        print('Stupid one think {} secs to get {}'.format(sleep_secs, b))
        a, b = b, a + b
        index += 1

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    tasks = [
        asyncio.async(smart_fib(10)),   # asyncio.async() was later renamed asyncio.ensure_future()
        asyncio.async(stupid_fib(10)),
    ]
    loop.run_until_complete(asyncio.wait(tasks))
    print('All fib finished.')
    loop.close()

asyncio is a module built around an event loop that implements asynchronous I/O. Through yield from, we hand control of asyncio.sleep over to the event loop, and the current coroutine is suspended; the event loop then decides when asyncio.sleep has completed and wakes the coroutine to continue executing the code that follows.

This may sound abstract. Fortunately asyncio is implemented in Python, so we can look at what asyncio.sleep actually does:

@coroutine
def sleep(delay, result=None, *, loop=None):
    """Coroutine that completes after a given time (in seconds)."""
    future = futures.Future(loop=loop)
    h = future._loop.call_later(delay,
                                future._set_result_unless_cancelled, result)
    try:
        return (yield from future)
    finally:
        h.cancel()

First, sleep creates a Future object, which serves as an even more deeply nested coroutine-like object that is passed up the yield from chain to the event loop; second, it registers a callback with the event loop by calling the loop's call_later function.
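As a side note, call_later itself is easy to try in isolation. Here is a minimal, standalone sketch (the callback name fire and the 0.5 second delay are arbitrary choices for illustration):

import asyncio

def fire():
    print('timer fired')
    loop.stop()              # let run_forever() return once the callback has run

loop = asyncio.new_event_loop()
loop.call_later(0.5, fire)   # schedule fire() to run about 0.5 seconds from now
loop.run_forever()
loop.close()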

Looking at the source of the Future class, you can see that a Future is an object whose __iter__ method is itself a generator:

class Future:
    # blabla...
    def __iter__(self):
        if not self.done():
            self._blocking = True
            yield self  # This tells Task to wait for completion.
        assert self.done(), "yield from wasn't used with future"
        return self.result()  # May raise too.

So when our coroutine does yield from asyncio.sleep(...), what the event loop actually interacts with is the Future object. Each time the event loop calls send(None), the call is ultimately routed into the Future's __iter__; as long as the Future is not done, it yields self, which means it stays suspended and waits for the next send(None) to wake it up.

When the coroutine is wrapped into a Task object, the Task's initialization kicks off the first send(None) and, when a Future is yielded, registers a wake-up callback on that Future.

class Task(futures.Future):
    # blabla...
    def _step(self, value=None, exc=None):
        # blabla...
        try:
            if exc is not None:
                result = coro.throw(exc)
            elif value is not None:
                result = coro.send(value)
            else:
                result = next(coro)
        # except clauses omitted
        else:
            if isinstance(result, futures.Future):
                # Yielded Future must come from Future.__iter__().
                if result._blocking:
                    result._blocking = False
                    result.add_done_callback(self._wakeup)
        # blabla...

    def _wakeup(self, future):
        try:
            value = future.result()
        except Exception as exc:
            # This could also be a cancellation.
            self._step(None, exc)
        else:
            self._step(value, None)
        self = None  # Needed to break cycles when an exception occurs.

After the preset delay, the event loop calls future._set_result_unless_cancelled:

class Future:
    # blabla...
    def _set_result_unless_cancelled(self, result):
        """Helper setting the result only if the future was not cancelled."""
        if self.cancelled():
            return
        self.set_result(result)

    def set_result(self, result):
        """Mark the future done and set its result.

        If the future is already done when this method is called, raises
        InvalidStateError.
        """
        if self._state != _PENDING:
            raise InvalidStateError('{}: {!r}'.format(self._state, self))
        self._result = result
        self._state = _FINISHED
        self._schedule_callbacks()

This changes the state of the Future and invokes the previously registered Task._wakeup; _wakeup in turn calls Task._step again. Because the Future is now marked as done, it no longer yields self; its return statement raises a StopIteration exception, which Task._step catches and uses to set the Task's result. At the same time the whole yield from chain is woken up, and the coroutine continues executing downward.
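The whole chain can be imitated with a hand-rolled toy, which may help make the suspend/wake-up dance above more tangible. ToyFuture, toy_sleep, and toy_coro below are purely illustrative names; real asyncio Futures and Tasks do considerably more (cancellation, exception handling, scheduling), so treat this as a sketch of the idea only:

class ToyFuture:
    """A bare-bones stand-in for asyncio.Future: yields itself until a result is set."""
    _UNSET = object()

    def __init__(self):
        self._result = self._UNSET

    def done(self):
        return self._result is not self._UNSET

    def set_result(self, result):
        self._result = result

    def __iter__(self):
        if not self.done():
            yield self               # suspend: hand ourselves to whoever is driving
        return self._result          # becomes StopIteration(result) for the caller

def toy_sleep(fut):
    value = yield from fut           # suspends until fut.set_result() is called
    return value

def toy_coro(fut):
    woke_with = yield from toy_sleep(fut)
    print('resumed with', woke_with)
    return 'done'

fut = ToyFuture()
coro = toy_coro(fut)
pending = coro.send(None)            # runs until "yield self"; the pending future comes back out
assert pending is fut and not fut.done()

fut.set_result(42)                   # what _set_result_unless_cancelled does for a real future
try:
    coro.send(None)                  # the wake-up: the future is done, control flows back up
except StopIteration as stop:
    print('task result:', stop.value)  # prints: task result: done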

async and await

Once you have figured out @asyncio.coroutine and yield from, async and await in Python 3.5 are easy to understand: you can read them as a polished stand-in for @asyncio.coroutine and yield from. From the point of view of Python's design, async/await also lets coroutines exist at the syntax level independently of generators, hiding the details inside the asyncio module and making the syntax clearer.

import asyncio
import random

async def smart_fib(n):
    index = 0
    a = 0
    b = 1
    while index < n:
        sleep_secs = random.uniform(0, 0.2)
        await asyncio.sleep(sleep_secs)
        print('Smart one think {} secs to get {}'.format(sleep_secs, b))
        a, b = b, a + b
        index += 1

async def stupid_fib(n):
    index = 0
    a = 0
    b = 1
    while index < n:
        sleep_secs = random.uniform(0, 0.4)
        await asyncio.sleep(sleep_secs)
        print('Stupid one think {} secs to get {}'.format(sleep_secs, b))
        a, b = b, a + b
        index += 1

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    tasks = [
        asyncio.ensure_future(smart_fib(10)),
        asyncio.ensure_future(stupid_fib(10)),
    ]
    loop.run_until_complete(asyncio.wait(tasks))
    print('All fib finished.')
    loop.close()

Summary

That wraps up coroutines in Python. The example programs use sleep as a stand-in for asynchronous I/O; in a real project you would use coroutines for asynchronous network reads and writes, file I/O, interface rendering, and so on, and while waiting for those operations to complete, the CPU can carry out other computations. That is precisely what coroutines are for.
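For reference, here is a minimal sketch of how the same idea reads with the higher-level API that later versions of asyncio added (asyncio.run and asyncio.gather, available from Python 3.7); fake_io is an illustrative name and still just sleeps to stand in for real network or file I/O:

import asyncio
import random
import time

async def fake_io(name):
    delay = random.uniform(0.1, 0.5)
    await asyncio.sleep(delay)       # the event loop runs the other tasks in the meantime
    return '{} waited {:.2f}s'.format(name, delay)

async def main():
    start = time.monotonic()
    results = await asyncio.gather(*(fake_io('task-%d' % i) for i in range(5)))
    for line in results:
        print(line)
    # Total wall time is close to the single longest delay, not the sum of all of them.
    print('total: {:.2f}s'.format(time.monotonic() - start))

asyncio.run(main())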
