Using threads in place of epoll to illustrate the coroutine implementation principle (at the yield, the time-consuming work is handed to another thread so the main thread keeps running; when that thread finishes, it uses send() to pass the result back and resume the generator)

Source: Internet
Author: User
Tags: epoll

1. A practical explanation of async

For a time-consuming operation, we hand it to someone else (for example, another thread) to execute while we carry on with our own work; when that someone finishes the time-consuming operation and feeds the result back to us, that is what we call async.

Here we use an easy-to-understand threading mechanism to implement asynchrony.
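
For instance, here is a minimal sketch of that idea using a worker thread and a callback; the names long_io and on_result are only illustrative, not part of the article's code:

# coding:utf-8
import time
import thread                         # Python 2 low-level threading module


def long_io(callback):
    # run the slow operation on another thread, then hand the result to the callback
    def worker():
        time.sleep(5)                 # the time-consuming part
        callback("io result")         # feed the result back once the work is done
    thread.start_new_thread(worker, ())


def on_result(ret):
    print "got result: %s" % ret


if __name__ == '__main__':
    long_io(on_result)                # hand the slow work to another thread
    print "main keeps running without waiting"
    time.sleep(6)                     # keep the process alive long enough to see the callback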

2. The implementation principle of the coroutine-style notation

When an asynchronous program is written with callback functions, the code for one piece of execution logic (handling request a) is split into two functions, req_a and on_finish, which looks very different from the synchronous version. A synchronous program makes the business logic much easier to follow, so can we write asynchronous programs with code that reads like synchronous code?
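
The article does not show that callback version; here is a minimal sketch of what the split into req_a and on_finish might look like (the shape of long_io taking a callback is an assumption for illustration):

# coding:utf-8
import time
import thread


def long_io(callback):
    # run the slow operation on another thread, then invoke the callback with the result
    def worker():
        time.sleep(5)
        callback("io result")
    thread.start_new_thread(worker, ())


def req_a():
    # first half of the logic: start handling the request and kick off the slow IO
    print "start processing request req_a"
    long_io(on_finish)


def on_finish(ret):
    # second half of the logic: continue once the IO result arrives
    print "ret: %s" % ret
    print "finish processing request req_a"


if __name__ == '__main__':
    req_a()
    time.sleep(6)    # keep the process alive long enough for on_finish to run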

Recall the role of the yield keyword?
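
As a short standalone refresher: yield suspends the generator, next() runs it up to the yield, and send() resumes it while passing a value in, which becomes the value of the yield expression.

def demo():
    ret = yield "suspended"              # pause here; the value passed to send() resumes us
    print "resumed with: %s" % ret


g = demo()
print g.next()                           # runs up to the yield and prints "suspended"
try:
    g.send("hello")                      # wakes the generator, which prints "resumed with: hello"
except StopIteration:                    # the generator finishes right after resuming
    pass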
Initial version

# coding:utf-8
import time
import thread

gen = None  # global generator, for long_io to use


def long_io():
    def fun():
        print "start IO operation"
        global gen
        time.sleep(5)
        try:
            print "finish IO operation, send the result to wake up the suspended program"
            gen.send("io result")   # use send to return the result and resume execution
        except StopIteration:       # catch the generator finishing so the thread does not crash
            pass
    thread.start_new_thread(fun, ())


def req_a():
    print "start processing request req_a"
    ret = yield long_io()
    print "ret: %s" % ret
    print "finish processing request req_a"


def req_b():
    print "start processing request req_b"
    time.sleep(2)
    print "finish processing request req_b"


def main():
    global gen
    gen = req_a()
    gen.next()   # start the generator req_a
    req_b()
    while 1:
        pass


if __name__ == '__main__':
    main()

Execution process:

start processing request req_a
start processing request req_b
start IO operation
finish processing request req_b
finish IO operation, send the result to wake up the suspended program
ret: io result
finish processing request req_a
Upgraded version

In the version above, although req_a itself is written in a way that closely resembles synchronous code, main cannot simply call req_a as an ordinary function; it has to treat it as a generator (create it and drive it with next()).

Now we modify the program so that req_a and main can both be written in a way that resembles synchronous code.

# coding:utf-8
import time
import thread

gen = None  # global generator, for long_io to use


def gen_coroutine(f):
    def wrapper(*args, **kwargs):
        global gen
        gen = f()
        gen.next()   # start the generator returned by f
    return wrapper


def long_io():
    def fun():
        print "start IO operation"
        global gen
        time.sleep(5)
        try:
            print "finish IO operation, send the result to wake up the suspended program"
            gen.send("io result")   # use send to return the result and resume execution
        except StopIteration:       # catch the generator finishing so the thread does not crash
            pass
    thread.start_new_thread(fun, ())


@gen_coroutine
def req_a():
    print "start processing request req_a"
    ret = yield long_io()
    print "ret: %s" % ret
    print "finish processing request req_a"


def req_b():
    print "start processing request req_b"
    time.sleep(2)
    print "finish processing request req_b"


def main():
    req_a()
    req_b()
    while 1:
        pass


if __name__ == '__main__':
    main()

Execution process:

start processing request req_a
start processing request req_b
start IO operation
finish processing request req_b
finish IO operation, send the result to wake up the suspended program
ret: io result
finish processing request req_a
Final version

The version just completed is still not ideal, because it relies on a global variable gen for long_io to use. We now rewrite the program once more to eliminate the global variable gen.

# coding:utf-8
import time
import thread


def gen_coroutine(f):
    def wrapper(*args, **kwargs):
        gen_f = f()            # gen_f is the generator req_a
        r = gen_f.next()       # r is the generator long_io
        def fun(g):
            ret = g.next()     # run the generator long_io
            try:
                gen_f.send(ret)     # pass the result back to req_a and let it continue
            except StopIteration:   # req_a finishes after resuming; catch it so the thread does not crash
                pass
        thread.start_new_thread(fun, (r,))
    return wrapper


def long_io():
    print "start IO operation"
    time.sleep(5)
    print "finish IO operation, yield back the result"
    yield "io result"


@gen_coroutine
def req_a():
    print "start processing request req_a"
    ret = yield long_io()
    print "ret: %s" % ret
    print "finish processing request req_a"


def req_b():
    print "start processing request req_b"
    time.sleep(2)
    print "finish processing request req_b"


def main():
    req_a()
    req_b()
    while 1:
        pass


if __name__ == '__main__':
    main()

Execution process:

start processing request req_a
start processing request req_b
start IO operation
finish processing request req_b
finish IO operation, yield back the result
ret: io result
finish processing request req_a

This final version is the simplest model for understanding the principle of Tornado's asynchronous programming. The mechanism Tornado actually uses, however, is not threads but epoll: the asynchronous operation is handed to epoll, which monitors it and triggers the callback when it completes.

One thing to note: strictly speaking, the version we implemented is not a real coroutine, because the suspension and wake-up of the two pieces of code happen on two different threads. Tornado uses epoll to implement asynchrony, so suspension and wake-up happen on a single thread and are scheduled by Tornado itself, which makes it a coroutine in the true sense. Even so, this does not prevent us from understanding the principle of Tornado's asynchronous programming.
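
For comparison, here is a minimal sketch of roughly the same request written with Tornado's own coroutine machinery, using the classic gen.coroutine API (the URL is just a placeholder): yield suspends req_a, and the IOLoop, built on epoll on Linux, resumes it on the same thread once the IO completes.

# coding:utf-8
from tornado import gen, ioloop
from tornado.httpclient import AsyncHTTPClient


@gen.coroutine
def req_a():
    print("start processing request req_a")
    client = AsyncHTTPClient()
    # yield suspends req_a; the IOLoop resumes it with the response
    # once the network IO is ready, all on a single thread
    response = yield client.fetch("http://example.com/")
    print("ret: %d bytes" % len(response.body))
    print("finish processing request req_a")


if __name__ == '__main__':
    ioloop.IOLoop.current().run_sync(req_a)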

