This article briefly introduces concurrent programming with generators in Python. Using yield generator functions is an important part of advanced Python learning. The subject here is concurrent (not parallel) programming, for which there are currently four common approaches: multi-process, multi-thread, asynchronous I/O, and coroutines.
Multi-process programming: Python has a C-like os.fork, as well as the higher-level multiprocessing standard library that wraps it. A previously written article on high-availability Python programming describes a signal-handling scheme similar to the one between the master and worker processes in nginx, which ensures that the main process can perceive when a business process exits.
Multi-threaded programming: Python provides the thread and threading modules. In Linux, a so-called thread is actually a lightweight process (LWP), scheduled by the kernel in the same way as a process. There is already plenty of material on COW (copy-on-write), fork, vfork, clone, and so on, so they are not repeated here.
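A minimal Python 3 sketch of the threading module (the counter and thread counts are illustrative): several threads increment a shared counter, with a lock protecting the shared state:

```python
import threading

counter = 0
lock = threading.Lock()

def add(n):
    global counter
    for _ in range(n):
        with lock:  # serialize access to the shared counter
            counter += 1

threads = [threading.Thread(target=add, args=(1000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 4000
```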
In Linux, there are three main I/O multiplexing mechanisms for asynchronous programming: select, poll, and epoll. Asynchronous I/O is not the focus of this article.
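For completeness, a small sketch of the select mechanism via Python's select module (the socketpair here just fabricates a readable socket for demonstration; it is not from the original article):

```python
import select
import socket

# a pair of connected sockets to demonstrate readiness notification
a, b = socket.socketpair()
b.send(b"ping")

# select blocks until at least one watched socket is readable (1s timeout)
readable, _, _ = select.select([a], [], [], 1.0)
if a in readable:
    print(a.recv(4))  # b'ping'

a.close()
b.close()
```

poll and epoll expose the same idea through select.poll() and select.epoll(), scaling better with many descriptors.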
Any discussion of coroutines has to mention yield. Let's look at an example:
# coding=utf-8
import time
import sys

# producer
def produce(l):
    i = 0
    while 1:
        if i < 5:
            l.append(i)
            yield i
            i = i + 1
            time.sleep(1)
        else:
            return

# consumer
def consume(l):
    p = produce(l)
    while 1:
        try:
            p.next()
            while len(l) > 0:
                print l.pop()
        except StopIteration:
            sys.exit(0)

l = []
consume(l)
In the preceding example, calling produce(l) returns a generator. Each time we call p.next(), the program resumes inside produce, runs to yield i, and pauses there, so one element has been appended to l; we then print l.pop(), and repeat until the generator is exhausted and p.next() raises a StopIteration exception.
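For readers on Python 3, where generator.next() became the built-in next(p), here is a sketch of the same producer/consumer mechanics (the time.sleep calls are dropped and results are collected in a list instead of printed, for brevity):

```python
def produce(l):
    i = 0
    while i < 5:
        l.append(i)
        yield i  # pause here; the consumer resumes us with next()
        i += 1

l = []
p = produce(l)
out = []
try:
    while True:
        next(p)       # resume the generator; it appends one item to l
        while l:
            out.append(l.pop())
except StopIteration:  # raised when the generator function returns
    pass
print(out)  # [0, 1, 2, 3, 4]
```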
This example shows that coroutine scheduling is invisible to the kernel and that coroutines are scheduled cooperatively: a coroutine gives up control only at a yield point. As a result, when concurrency reaches the tens of thousands, coroutines perform much better than threads.
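To make the cooperative nature concrete, here is a minimal sketch of a round-robin scheduler built on plain generators (Python 3; the names task and run are illustrative, not from the original). No kernel threads are involved; each task runs until it voluntarily yields:

```python
from collections import deque

def task(name, steps, log):
    for i in range(steps):
        log.append((name, i))
        yield  # voluntarily give up control to the scheduler

# a tiny round-robin scheduler over generator-based coroutines
def run(tasks):
    queue = deque(tasks)
    while queue:
        t = queue.popleft()
        try:
            next(t)          # run the task until its next yield
            queue.append(t)  # still alive: requeue it
        except StopIteration:
            pass             # task finished; drop it

log = []
run([task("A", 2, log), task("B", 2, log)])
print(log)  # [('A', 0), ('B', 0), ('A', 1), ('B', 1)]
```

The interleaved log shows the scheduler alternating between tasks entirely in user space, which is exactly why the kernel never sees this scheduling.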
Here is an example of the same idea using Stackless Python's tasklets and channels:

import stackless
import urllib2

def output():
    while 1:
        url = chan.receive()
        print url
        f = urllib2.urlopen(url)
        #print f.read()
        print stackless.getcurrent()

def input():
    f = open('url.txt')
    l = f.readlines()
    for i in l:
        chan.send(i)

chan = stackless.channel()
[stackless.tasklet(output)() for i in xrange(10)]
stackless.tasklet(input)()
stackless.run()
For more on coroutines, see greenlet, Stackless, gevent, and eventlet.
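In modern Python the standard library offers the same channel-style pattern through asyncio; as a hedged sketch (this is a Python 3 equivalent of the Stackless example above, not part of the original), a producer feeds an asyncio.Queue that a consumer drains, with None as an assumed stop sentinel:

```python
import asyncio

async def consumer(q, out):
    while True:
        url = await q.get()   # like chan.receive(): suspends until data arrives
        if url is None:       # sentinel: producer is done
            break
        out.append(url)

async def producer(q, urls):
    for u in urls:
        await q.put(u)        # like chan.send()
    await q.put(None)

async def main():
    q = asyncio.Queue()
    out = []
    await asyncio.gather(producer(q, ["a", "b", "c"]), consumer(q, out))
    return out

out = asyncio.run(main())
print(out)  # ['a', 'b', 'c']
```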