4. Generator (generator)
4.1. Introduction to Generators
First, be clear that a generator is an iterator: it has a next method and behaves exactly like an iterator, which means a generator can also be used in a Python for loop. In addition, the special syntax support for generators makes writing a generator much easier than writing a custom iterator, so generators are one of the most commonly used features.
Starting with Python 2.5, the implementation of [PEP 342: Coroutines via Enhanced Generators] adds more features to generators, which means generators can do even more work. This will be covered in a later section.
4.2. Generator functions
4.2.1. Using the generator function to define the generator
How do I get a generator? First look at a small piece of code:
The code is as follows:
>>> def get_0_1_2():
...     yield 0
...     yield 1
...     yield 2
...
>>> get_0_1_2
<function get_0_1_2 at 0x...>
We have defined a function get_0_1_2, and we can see that it is indeed of function type. But unlike an ordinary function, the body of get_0_1_2 uses the keyword yield, which makes get_0_1_2 a generator function. The properties of a generator function are as follows:
1. Calling the generator function returns a generator;
The code is as follows:
>>> generator = get_0_1_2()
>>> generator
<generator object at 0x...>
2. When the generator's next method is called for the first time, the generator starts executing the body of the generator function (as opposed to executing it when the generator is built) until it encounters a yield, where it pauses (is suspended), and the argument of the yield becomes the return value of that next call;
The code is as follows:
>>> generator.next()
0
3. On each subsequent call of the generator's next method, the generator resumes executing the generator function from the place where it last paused, until it encounters a yield again, and again the argument of the yield becomes the return value of next;
The code is as follows:
>>> generator.next()
1
>>> generator.next()
2
4. If the generator function finishes while next is being called (by hitting an empty return statement or reaching the end of the function body), that call to next raises a StopIteration exception (which is exactly the termination condition of the for loop);
The code is as follows:
>>> generator.next()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
StopIteration
5. Every time the generator function pauses, all variables in the function body are frozen inside the generator and restored when execution resumes; moreover, similar to closures, even generators returned by the same generator function are independent of each other.
Our small example doesn't use any variables, so here is another generator that shows this feature:
The code is as follows:
>>> def fibonacci():
...     a = b = 1
...     yield a
...     yield b
...     while True:
...         a, b = b, a + b
...         yield b
...
>>> for num in fibonacci():
...     if num > 100: break
...     print num,
...
1 1 2 3 5 8 13 21 34 55 89
Don't be surprised by the while True: because the generator can be suspended, evaluation is lazy, so the infinite loop is not a problem. In this example, we define a generator that produces the Fibonacci sequence.
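To see the independence mentioned in point 5 directly, here is a minimal interactive sketch (Python 2 style) using the fibonacci generator defined above; the two generators advance separately and do not affect each other:
The code is as follows:
>>> g1 = fibonacci()
>>> g2 = fibonacci()
>>> g1.next(), g1.next(), g1.next()
(1, 1, 2)
>>> g2.next()
1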
4.2.2. FAQ for Generator functions
Next, let's discuss some interesting topics about generators.
1. In the examples, the generator functions take no parameters; can a generator function take parameters?
Of course it can, and it supports all the parameter forms of an ordinary function. Keep in mind that a generator function is still a function :)
The code is as follows:
>>> def counter(start=0):
...     while True:
...         yield start
...         start += 1
...
This is a counter that starts with the specified number.
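For example, a quick check in the interactive interpreter (a minimal usage sketch):
The code is as follows:
>>> c = counter(5)
>>> c.next()
5
>>> c.next()
6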
2. Since a generator function is also a function, can it use return to output a value?
No. A generator function already has its default return value, the generator itself, so you cannot give it another return value; no, not even return None. It can, however, end with an empty return statement. If you insist on specifying a return value, Python raises a syntax error at the definition, like this:
The code is as follows:
>>> def i_wanna_return():
...     yield None
...     return None
...
  File "<stdin>", line 3
SyntaxError: 'return' with argument inside generator
3. OK, so suppose I need to guarantee that a resource is released and therefore put the yield inside a try...finally. What happens then? (I just want to mess with you:) I even put a yield inside the finally!
Python executes the code in the finally clause only when it really leaves the try...finally block, and I regret to inform you that pausing does not count as leaving! So you can probably guess the outcome:
The code is as follows:
>>> def play_u():
...     try:
...         yield 1
...         yield 2
...         yield 3
...     finally:
...         yield 0
...
>>> for val in play_u(): print val,
...
1 2 3 0
* This is different from return: return really does exit the code block, so the finally clause is executed immediately when return is reached (see the short sketch after these notes).
* In addition, "yield inside a try block that has a finally clause" was only defined in PEP 342, which means this syntax is supported only in Python 2.5 and later; in Python 2.4 and earlier it produces a syntax error.
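Here is the sketch mentioned in the first note, using an ordinary (non-generator) function just for illustration; the name normal_return is made up for this example. Return really does leave the try block, so the finally clause runs at once:
The code is as follows:
>>> def normal_return():
...     try:
...         return 1
...     finally:
...         print 'finally runs before the function really exits'
...
>>> normal_return()
finally runs before the function really exits
1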
4. What if I need to iterate over another generator while iterating inside my generator? Just write it out like the following; it looks a bit silly and naive, but it works:
The code is as follows:
>>> def sub_generator():
...     yield 1
...     yield 2
...     for val in counter(10): yield val
...
A syntax improvement for this situation has been defined in [PEP 380: Syntax for Delegating to a Subgenerator], which is expected to land in Python 3.3 and may also be backported to 2.x. Once it is implemented, you will be able to write this:
The code is as follows:
>>> def sub_generator():
...     yield 1
...     yield 2
...     yield from counter(10)
  File "<stdin>", line 4
    yield from counter(10)
             ^
SyntaxError: invalid syntax
See the syntax error? For now, we just have to stay naive.
Have more questions? Please reply to this article:)
4.3. Coroutines (coroutine)
A coroutine generally refers to a function with the following properties:
1. It has its own local variables and instruction pointer, but still shares global variables;
2. It can be conveniently suspended and resumed, and has multiple entry and exit points;
3. Several coroutines cooperate with each other; for example, coroutine A may need the result of coroutine B during its run before it can continue executing.
The nature of coroutines means that only one coroutine runs at any given moment (ignoring multithreading). Thanks to this, coroutines can pass objects to each other directly without worrying about resource locks, and can wake each other directly without having to sleep actively, rather like threads with built-in locking. In applications that fit the coroutine model, using coroutines is undoubtedly more convenient than using threads.
On the other hand, the fact that coroutines cannot truly run concurrently also limits their applications to a fairly narrow range; this trait makes coroutines more comparable to ordinary functions than to threads. Of course, threads are far more complex and more powerful than coroutines, so I recommend getting a solid grasp of threads: see the Python threading guide.
In this section I will not list full coroutine examples; please understand the following methods on their own terms.
Python 2.5's enhancements to generators implement the remaining features of coroutines. The following methods were added to generators:
1. send(value):
send is another way to resume a generator besides next. In Python 2.5 the yield statement became the yield expression, which means yield can now have a value: when the generator's send method is called to resume execution, the value of the yield expression is the argument passed to send.
The code is as follows:
>>> def repeater():
...     n = 0
...     while True:
...         n = (yield n)
...
>>> r = repeater()
>>> r.next()
0
>>> r.send(10)
10
* The generator must be in a suspended state before send can be called with a value other than None; otherwise an exception is thrown. However, send(None) can still be called on a generator that has not been started yet (see the short sketch after these notes).
* If you use next to resume the generator, the value of the yield expression will be None.
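Here is the sketch mentioned in the first note above, using the repeater generator (the exact wording of the error message may differ between Python versions):
The code is as follows:
>>> r2 = repeater()
>>> r2.send(10)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: can't send non-None value to a just-started generator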
2. close():
This method is used to close the generator. Calling next or send again after the generator has been closed raises a StopIteration exception.
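For example, continuing with the repeater generator r from above (a minimal sketch):
The code is as follows:
>>> r.close()
>>> r.next()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
StopIteration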
3. throw(type, value=None, traceback=None):
This method is used to raise an exception inside the generator (at the point where the generator is currently suspended, or at the start of the definition if it has not been started yet).
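A minimal sketch: the exception is raised at the yield where the generator is suspended, and since repeater does not catch it, it propagates back to the caller (the traceback details may vary):
The code is as follows:
>>> r3 = repeater()
>>> r3.next()
0
>>> r3.throw(ValueError, "thrown from outside")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 4, in repeater
ValueError: thrown from outside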
* Don't feel bad about not seeing a full example; the most common form a coroutine takes is still the generator.
4.4. An interesting library: pipe
In this section I would like to give you a brief introduction to pipe. pipe is not a built-in Python library; if you have easy_install you can install it directly, otherwise you need to download it yourself: http://pypi.python.org/pypi/pipe
This library is introduced here because it shows us a very novel way of using iterators and generators: streams. pipe treats iterable data as a stream: much like a Linux pipe, it uses '|' to pass the data stream along, and it defines a series of "stream processing" functions that accept and process the stream, finally either emitting the stream again or reducing it to a single result. Let's look at some examples.
The first one, very simple, uses add to sum:
The code is as follows:
>>> from pipe import *
>>> range(5) | add
10
To sum only the even numbers we need where, which acts like the built-in function filter: it keeps the elements that satisfy the condition:
The code is as follows:
>>> range(5) | where(lambda x: x % 2 == 0) | add
6
Remember the Fibonacci sequence generator we defined? To find the sum of all even numbers less than 10000 in the sequence we also need take_while, which, like the itertools function of the same name, takes elements until the condition no longer holds:
The code is as follows:
>>> fib = fibonacci
>>> fib() | where(lambda x: x % 2 == 0) \
...       | take_while(lambda x: x < 10000) \
...       | add
3382
If you need to apply a function to every element, you can use select, which acts like the built-in function map; if you need the result as a list, you can use as_list:
The code is as follows:
>>> fib() | select(lambda x: x ** 2) | take_while(lambda x: x < 100) | as_list
[1, 1, 4, 9, 25, 64]
pipe contains many more stream-processing functions. You can even define a stream processor yourself: just write a generator function and apply the Pipe decorator. A stream processor that takes elements until the index no longer satisfies a condition can be defined as follows:
The code is as follows:
>>> @Pipe
... def take_while_idx(iterable, predicate):
...     for idx, x in enumerate(iterable):
...         if predicate(idx): yield x
...         else: return
...
Use this stream processor to get the first 10 numbers of the Fibonacci sequence:
The code is as follows:
>>> fib() | take_while_idx(lambda x: x < 10) | as_list
[1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
I won't introduce more functions here; you can look through pipe's source file: of its roughly 600 lines, about 300 are documentation, and the documentation contains a large number of examples.
pipe's implementation is very simple: the Pipe decorator wraps an ordinary generator function (or any function returning an iterator) in an instance of an ordinary class that implements the __ror__ method. Still, the idea is genuinely interesting.
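To make the idea concrete, here is a minimal sketch of a Pipe-like wrapper built around __ror__. This is not pipe's actual source code, just an illustration of the idea; the names MyPipe and my_sum are made up for the example:
The code is as follows:
>>> class MyPipe(object):
...     # Wrap a function so that 'data | wrapped' calls the function on the data.
...     def __init__(self, func):
...         self.func = func
...     def __ror__(self, other):
...         # 'other | self' falls back to this method when 'other' has no __or__
...         return self.func(other)
...     def __call__(self, *args, **kwargs):
...         # Support parameterized use such as 'data | wrapped(extra)'
...         return MyPipe(lambda iterable: self.func(iterable, *args, **kwargs))
...
>>> @MyPipe
... def my_sum(iterable):
...     return sum(iterable)
...
>>> range(5) | my_sum
10
The real pipe adds more conveniences, but the core trick is the same: the object on the right-hand side of '|' receives the data stream on the left through its __ror__ method.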
This is the end of the Functional Programming Guide. I hope this series of articles has been helpful, and that you have seen a bit of programming beyond structured programming and will be able to use it skillfully in the right places :)
Tomorrow I will tidy up a table of contents and post it for easy browsing, and list some articles for further reference. Unfortunately, these articles are almost all in English, so please study your English hard --#