This article is part four of a Python functional programming guide: generators in detail. It covers an introduction to generators, generator functions, a generator-function FAQ, and more; readers who need this material may find it a useful reference.
4. Generators (generator)
4.1. Introduction to generators
First, be clear that a generator is a kind of iterator. A generator has the next method and behaves exactly like an iterator, which means a generator can also be used in a Python for loop. In addition, the special syntax support for generators makes writing one much simpler than writing a custom iterator, so generators are among Python's most commonly used features.
Starting with Python 2.5, [PEP 342: Coroutines via Enhanced Generators] added more features to generators, which means generators can do even more work. Those features are introduced in the last part of this chapter.
4.2. Generator function
4.2.1. Defining a generator with a generator function
How do we get a generator? First, look at a small piece of code:
The code is as follows:
>>> def get_0_1_2():
...     yield 0
...     yield 1
...     yield 2
...
>>> get_0_1_2
<function get_0_1_2 at 0x...>
We defined a function get_0_1_2, and we can see that it really is of function type. But unlike an ordinary function, the body of get_0_1_2 uses the keyword yield, which makes get_0_1_2 a generator function. Generator functions have the following characteristics:
1. Calling a generator function returns a generator;
The code is as follows:
>>> generator = get_0_1_2()
>>> generator
<generator object at 0x...>
2. The first time the generator's next method is called, the generator begins executing the generator function (not when the generator is created), and runs until a yield is encountered, where it pauses; the argument of that yield is the return value of this next call;
The code is as follows:
>>> generator.next()
0
3. Each subsequent call to the generator's next method resumes execution of the generator function from where it last paused, until a yield is encountered again; as before, the argument of that yield is the return value of this next call;
The code is as follows:
>>> generator.next()
1
>>> generator.next()
2
4. If the generator function finishes when next is called (either by hitting an (empty) return statement or by reaching the end of the function body), this next call raises a StopIteration exception (which is exactly the termination condition of the for loop);
The code is as follows:
>>> generator.next()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
StopIteration
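The four behaviors above are exactly what a for loop relies on. As a sketch (in Python 3 syntax, where the built-in next(gen) replaces gen.next()), a for loop over a generator is roughly equivalent to:

```python
def get_0_1_2():
    yield 0
    yield 1
    yield 2

# Roughly what "for x in get_0_1_2(): ..." does under the hood:
generator = get_0_1_2()
results = []
while True:
    try:
        x = next(generator)  # resume the generator until its next yield
    except StopIteration:    # the generator function finished: stop looping
        break
    results.append(x)

print(results)  # [0, 1, 2]
```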
5. Each time the generator function pauses, all variables in the function body are frozen in the generator and restored when execution resumes; and, much like closures, even generators returned by the same generator function keep their stored variables independent of one another.
Our little example uses no variables, so here is another generator to demonstrate this feature:
The code is as follows:
>>> def fibonacci():
...     a = b = 1
...     yield a
...     yield b
...     while True:
...         a, b = b, a + b
...         yield b
...
>>> for num in fibonacci():
...     if num > 100: break
...     print num,
...
1 1 2 3 5 8 13 21 34 55 89
Don't be surprised by the while True: because a generator can pause, evaluation is deferred, and an infinite loop is not a problem. In this example, we define a generator that yields the Fibonacci sequence.
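Point 5 above (independently frozen variables) is easy to check with this same generator; the sketch below uses Python 3 syntax, where next(gen) replaces gen.next():

```python
def fibonacci():
    a = b = 1
    yield a
    yield b
    while True:
        a, b = b, a + b
        yield b

# Two generators built from the same generator function
# keep completely separate copies of a and b.
g1 = fibonacci()
g2 = fibonacci()
print(next(g1), next(g1), next(g1))  # g1 advances: 1 1 2 ...
print(next(g2))                      # ... while g2 still starts at 1
```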
4.2.2. Generator function FAQ
Next we'll discuss some interesting topics about generators.
1. Your example generator function has no parameters; can a generator function take parameters?
Of course it can, and it supports all the parameter forms of ordinary functions. Remember, a generator function is still a function:
The code is as follows:
>>> def counter(start=0):
...     while True:
...         yield start
...         start += 1
...
This is a counter that starts counting from a specified number.
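A quick usage sketch (Python 3 syntax; itertools.islice is used here to take finitely many values from the infinite counter):

```python
from itertools import islice

def counter(start=0):
    while True:
        yield start
        start += 1

# Take the first three values from a counter starting at 10.
print(list(islice(counter(10), 3)))  # [10, 11, 12]
```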
2. Since a generator function is also a function, can it use return to output a value?
No. A generator function already has a default return value, the generator itself, so you cannot give it another return value; no, not even return None. It can, however, end with an empty return statement. If you insist on specifying a return value, Python raises a syntax error at the point of definition, like this:
The code is as follows:
>>> def i_wanna_return():
...     yield None
...     return None
...
  File "<stdin>", line 3
SyntaxError: 'return' with argument inside generator
3. Well, what if I need to guarantee that resources are released, and so put the yield inside a try...finally? What happens then? (Just teasing) Let's yield inside the finally, too!
Python executes the finally block only when execution truly leaves the try...finally, and, sorry to tell you, pausing doesn't count as leaving! So you can guess the result:
The code is as follows:
>>> def play_u():
...     try:
...         yield 1
...         yield 2
...         yield 3
...     finally:
...         yield 0
...
>>> for val in play_u(): print val,
...
1 2 3 0
* This is different from the return case: return really does exit the code block, so the finally clause runs immediately upon return.
* Also, "yield inside a try block that has a finally clause" was defined in PEP 342, which means this syntax is only supported in Python 2.5 and later; in Python 2.4 and earlier it is a syntax error.
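One more corner case, sketched in Python 3 syntax: the close() method (described in section 4.3) raises GeneratorExit at the paused yield, and if the finally clause then yields again, Python complains with a RuntimeError. This is my own sketch of the expected behavior, not an example from the original article:

```python
def play_u():
    try:
        yield 1
        yield 2
        yield 3
    finally:
        yield 0  # yielding while GeneratorExit is pending is an error

g = play_u()
print(next(g))       # 1 -- paused inside the try block
try:
    g.close()        # raises GeneratorExit at the paused yield ...
except RuntimeError:
    print("RuntimeError: generator ignored GeneratorExit")
```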
4. What if, while iterating a generator, I need to splice in the iteration of another generator? The naive way to write it looks like this:
The code is as follows:
>>> def sub_generator():
...     yield 1
...     yield 2
...     for val in counter(10): yield val
...
A syntax improvement for this situation has been defined in [PEP 380: Syntax for Delegating to a Subgenerator]; it is said to be coming in Python 3.3, and may also be backported to 2.x. Once implemented, you will be able to write this:
The code is as follows:
>>> def sub_generator():
...     yield 1
...     yield 2
...     yield from counter(10)
  File "<stdin>", line 4
    yield from counter(10)
             ^
SyntaxError: invalid syntax
See the syntax error? For now, we can't use it.
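For the record, PEP 380 did land in Python 3.3, where yield from works as intended. A sketch (with itertools.islice truncating the infinite counter so the iteration terminates):

```python
from itertools import islice

def counter(start=0):
    while True:
        yield start
        start += 1

def sub_generator():
    yield 1
    yield 2
    # Delegate the rest of the iteration to another generator.
    yield from islice(counter(10), 3)

print(list(sub_generator()))  # [1, 2, 10, 11, 12]
```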
Any more questions? Feel free to leave a comment on this article.
4.3. Coroutines (coroutine)
A coroutine generally refers to a function that:
1. has its own local variables and instruction pointer, but still shares global variables;
2. can be conveniently suspended and resumed, and has multiple entry points and exit points;
3. runs cooperatively with other coroutines, for example pausing while B runs because A needs B's result to continue.
These characteristics mean that only one coroutine is running at any given time (ignoring multithreading). Thanks to this, coroutines can pass objects directly without worrying about resource locks, and can wake one another directly instead of sleeping and waiting, rather like threads with built-in locks. In application scenarios that fit the coroutine model, using coroutines is undoubtedly more convenient than using threads.
On the other hand, coroutines suit only a rather narrow range of applications, which makes it more natural to compare them with ordinary functions than with threads. Of course, threads are far more complex and powerful than coroutines, so I recommend mastering threads as well: see the Python threading guide.
I will not enumerate coroutine examples in this section; understand coroutines through the methods below.
Python 2.5's enhancements to generators implement some of the features of coroutines by adding the following methods to generators:
1. send(value):
send is the other method, besides next, that resumes a generator. In Python 2.5 the yield statement became a yield expression, which means yield can now have a value: when the generator's send method is called to resume execution, the argument passed to send becomes the value of the paused yield expression.
The code is as follows:
>>> def repeater():
...     n = 0
...     while True:
...         n = (yield n)
...
>>> r = repeater()
>>> r.next()
0
>>> r.send(10)
10
* Before calling send with a non-None value, the generator must be in the suspended state, otherwise an exception is raised. A generator that has not yet started, however, can still be advanced by calling send with None as the argument.
* If next is used to resume the generator, the value of the yield expression is None.
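Both notes can be verified directly; a sketch in Python 3 syntax (next(r) replaces r.next()):

```python
def repeater():
    n = 0
    while True:
        n = (yield n)

r = repeater()
try:
    r.send(10)        # not started yet: sending a non-None value fails
except TypeError:
    print("TypeError: can't send a non-None value to an unstarted generator")

print(next(r))        # 0 -- starts the generator, pausing at the yield
print(r.send(5))      # 5 -- the paused yield expression evaluates to 5
print(next(r))        # None -- resuming with next makes yield evaluate to None
```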
2. close():
This method closes the generator. Calling next or send on a closed generator raises a StopIteration exception.
3. throw(type, value=None, traceback=None):
This method raises an exception inside the generator (at the point where the generator is currently suspended, or at the point of definition if it has not been started).
* Don't be disappointed at not seeing coroutine examples here; the most common embodiment of a coroutine is precisely the generator.
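A small sketch of throw and close together (Python 3 syntax; the generator name here is mine, not from the article):

```python
def resilient_counter():
    # Counts up from 0; a ValueError thrown into it resets the count.
    n = 0
    while True:
        try:
            yield n
        except ValueError:
            n = 0        # exception raised at the paused yield, caught here
        else:
            n += 1

g = resilient_counter()
print(next(g), next(g), next(g))  # 0 1 2
print(g.throw(ValueError))        # 0 -- reset, then paused at yield again
print(next(g))                    # 1

g.close()                         # raises GeneratorExit inside g; g finishes
try:
    next(g)
except StopIteration:
    print("closed")
```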
4.4. An interesting library: pipe
In this section I'd like to briefly introduce pipe. pipe is not a library built into Python; if easy_install is available you can install it directly, otherwise you will need to download it yourself: http://pypi.python.org/pypi/pipe
The reason for introducing this library is that it shows us a novel way of using iterators and generators: stream processing. pipe treats iterable data as a stream: similar to Linux, pipe uses '|' to pass the data stream along, and it defines a series of "stream-processing" functions that accept a data stream, process it, and output a new data stream or aggregate the stream into a single result. Let's look at some examples.
First, something very simple: use add to sum:
The code is as follows:
>>> from pipe import *
>>> range(5) | add
10
To sum only the even numbers, we need where, which works like the built-in function filter: it keeps the elements that satisfy the condition:
The code is as follows:
>>> range(5) | where(lambda x: x % 2 == 0) | add
6
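For comparison, the same computation in plain Python, which makes the analogy explicit: where plays the role of filter, and add the role of sum:

```python
# Keep the even numbers of range(5), then sum them: 0 + 2 + 4.
evens = filter(lambda x: x % 2 == 0, range(5))
print(sum(evens))  # 6
```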
Remember our Fibonacci generator? To sum all the even numbers less than 10000 in the sequence, we need take_while, which, like the itertools function of the same name, takes elements until the condition no longer holds:
The code is as follows:
>>> fib = fibonacci
>>> fib() | where(lambda x: x % 2 == 0) \
...       | take_while(lambda x: x < 10000) \
...       | add
3382
To apply a function to every element, use select, which works like the built-in function map; to get a list, use as_list:
The code is as follows:
>>> fib() | select(lambda x: x ** 2) | take_while(lambda x: x < 100) | as_list
[1, 1, 4, 9, 25, 64]
pipe includes many more stream-processing functions, and you can even define your own: simply define a generator function and decorate it with Pipe. For example, a stream-processing function that takes elements as long as their index satisfies a condition can be defined like this:
The code is as follows:
>>> @Pipe
... def take_while_idx(iterable, predicate):
...     for idx, x in enumerate(iterable):
...         if predicate(idx): yield x
...         else: return
...
Using this stream-processing function to get the first 10 Fibonacci numbers:
The code is as follows:
>>> fib() | take_while_idx(lambda x: x < 10) | as_list
[1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
More functions will not be introduced here; you can read pipe's source file: it is less than 600 lines in total, of which about 300 lines are documentation containing a large number of examples.
pipe's implementation is very simple: the @Pipe decorator turns an ordinary generator function (or any function returning an iterator) into an instance of a class that implements the __ror__ method. Still, the idea is genuinely interesting.
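The idea is small enough to sketch. The following is a simplified stand-in (my own minimal version, not the library's actual code) showing how a class with __ror__ hooks the | operator:

```python
class Pipe:
    """Wrap a function so that `iterable | p` feeds iterable into it."""
    def __init__(self, func):
        self.func = func

    def __ror__(self, iterable):
        # Invoked for `iterable | self` when iterable has no matching __or__.
        return self.func(iterable)

    def __call__(self, *args, **kwargs):
        # Allow parameterized stages such as where(predicate).
        return Pipe(lambda iterable: self.func(iterable, *args, **kwargs))

@Pipe
def where(iterable, predicate):
    return (x for x in iterable if predicate(x))

@Pipe
def add(iterable):
    return sum(iterable)

print(range(5) | where(lambda x: x % 2 == 0) | add)  # 6
```

Since range(5) has no __or__ that accepts a Pipe, Python falls back to the right operand's __ror__, which passes the iterable to the wrapped function; each stage hands a new iterable to the next.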
This concludes the functional programming guide. I hope this series of articles has been helpful, that you have glimpsed some programming styles beyond structured programming, and that you can apply them skillfully in the right places.