Advanced usage of iterators and generators in Python

Iterators

An iterator is an object that adheres to the iterator protocol, which basically means that it has a next method that, when called, returns the next item in the sequence, and raises a StopIteration exception when there is nothing left to return.

An iterator object allows looping just once. It holds the state (position) of a single iteration, or, seen from the other side, each loop over a sequence requires its own iterator object. This means that we can iterate over the same sequence more than once concurrently. Separating the iteration logic from the sequence also lets us have more than one way of iterating.
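
As a minimal sketch of this, two iterators over the same list advance independently (the variable names are just for illustration):

>>> nums = [1, 2, 3]
>>> a, b = iter(nums), iter(nums)   # two independent iterator objects
>>> next(a), next(a)                # advancing a ...
(1, 2)
>>> next(b)                         # ... does not move b
1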

Calling the __iter__ method on a container object is the most straightforward way to get hold of an iterator. The iter function does that for us, saving a few keystrokes.

>>> nums = [1, 2, 3]
>>> iter(nums)                       # note that ... varies: these are different objects
<listiterator object at ...>
>>> nums.__iter__()
<listiterator object at ...>
>>> nums.__reversed__()
<listreverseiterator object at ...>

>>> it = iter(nums)
>>> next(it)                         # next(obj) simply calls obj.next()
1
>>> it.next()
2
>>> next(it)
3
>>> next(it)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
StopIteration

When used inside a loop, StopIteration is swallowed and causes the loop to finish. But with an explicit invocation we can see that once the iterator is exhausted, accessing it raises an exception.

The for...in loop also uses the __iter__ method. This allows us to transparently start iterating over a sequence. But if we already have an iterator, we want to be able to use it in a for loop in the same way. To achieve this, iterators are required to have, in addition to next, a method called __iter__ that returns the iterator itself (self).
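
As a small sketch (the variable names are illustrative), an iterator can be handed straight to a for loop precisely because iter(it) returns the iterator itself:

>>> nums = [1, 2, 3]
>>> it = iter(nums)
>>> it is iter(it)        # an iterator's __iter__ returns the iterator itself
True
>>> next(it)              # consume the first item by hand ...
1
>>> for i in it:          # ... the for loop then picks up where we left off
...     print(i)
2
3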

Support for iteration is pervasive in Python: all sequences and unordered containers in the standard library support it. The concept is also stretched to other things: for example, file objects support iteration over their lines.

>>> f = open('/etc/fstab')
>>> f is f.__iter__()
True

The file is an iterator itself, and its __iter__ method does not create a separate object: only a single thread of sequential access is allowed.
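
For instance, assuming /etc/fstab exists and is readable, a quick sketch of line-by-line iteration looks like this (the output naturally depends on the file's contents):

>>> f = open('/etc/fstab')
>>> for line in f:              # the file object hands out one line per step
...     print(line.rstrip())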

Generator expressions
A second way in which iterator objects are created is through generator expressions, the basis of list comprehensions. To increase clarity, a generator expression must always be enclosed in parentheses or in an enclosing expression. If round parentheses are used, a generator iterator is created. If square brackets are used, the process is short-circuited and we get a list.

>>> (i for i in nums)
<generator object <genexpr> at 0x...>
>>> [i for i in nums]
[1, 2, 3]
>>> list(i for i in nums)
[1, 2, 3]

In Python 2.7 and 3.x, the list comprehension syntax was extended to dictionary and set comprehensions. A set is created when the generator expression is enclosed in curly braces. A dict is created when the expression contains key-value pairs of the form key:value:

>>> {i for i in range(3)}
set([0, 1, 2])
>>> {i:i**2 for i in range(3)}
{0: 0, 1: 1, 2: 4}

If you are unlucky enough to be stuck with an older Python version, the syntax is only a bit worse:

>>> set(i for i in 'abc')
set(['a', 'c', 'b'])
>>> dict((i, ord(i)) for i in 'abc')
{'a': 97, 'c': 99, 'b': 98}

Generator expressions are fairly simple, not much more to say here. There is only one gotcha worth mentioning: in Python versions older than 3, the index variable (i) leaks out of the expression.
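
A short sketch of that gotcha under Python 2 (in Python 3 the comprehension variable no longer escapes):

>>> [i for i in range(3)]       # Python 2: the list comprehension's variable ...
[0, 1, 2]
>>> i                           # ... is still visible afterwards
2
>>> list(j for j in range(3))   # a generator expression does not leak
[0, 1, 2]
>>> j
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'j' is not defined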

Generators

A generator is a function that produces a sequence of results instead of a single value.

The third way to create iterator objects is to call a generator function. A generator is a function containing the keyword yield. It is worth noting that the mere presence of this keyword completely changes the nature of the function: the yield statement does not have to be invoked, or even be reachable, but it causes the function to be marked as a generator. When a normal function is called, the instructions in its body start executing. When a generator is called, execution stops before the first instruction in the body. A call to a generator function creates a generator object that adheres to the iterator protocol. As with ordinary functions, concurrent and recursive invocations are allowed.
When next is called, the function executes up to the first yield. Each yield statement encountered supplies a value that becomes the return value of next, and after the yield statement executes, the function's execution is suspended.

>>> def f():
...     yield 1
...     yield 2
>>> f()
<generator object f at 0x...>
>>> gen = f()
>>> gen.next()
1
>>> gen.next()
2
>>> gen.next()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
StopIteration

Let's walk through the entire lifetime of a single call to a generator function.

>>> def f():
...     print("--start--")
...     yield 3
...     print("--middle--")
...     yield 4
...     print("--finished--")
>>> gen = f()
>>> next(gen)
--start--
3
>>> next(gen)
--middle--
4
>>> next(gen)
--finished--
Traceback (most recent call last):
  ...
StopIteration

Unlike a normal function, where executing f() would immediately cause the first print to run, gen is assigned without any statement in the function body being executed. Only when gen.next() is invoked via next are the statements up to the first yield executed. The second next prints --middle-- and halts on the second yield. The third next prints --finished-- and falls off the end of the function; since no yield was reached, an exception is raised.

What happens to the function after a yield, when control returns to the caller? The state of each generator is stored in the generator object. From the generator function's point of view it looks almost as if it were running in a separate thread, but this is just an illusion: execution is strictly single-threaded, and the interpreter saves and restores the state between requests for the next value.

Why are generators useful? As emphasised in the section on iterators, generator functions are just another way of creating iterator objects. Everything that can be done with yield statements could also be done with next methods. Nevertheless, using a function and letting the interpreter do its magic to create the iterator has advantages. A function can be much shorter than a class definition with the required next and __iter__ methods. More importantly, the author of a generator finds it easier to reason about state kept in local variables than about instance attributes, which have to be used to pass data between consecutive invocations of next on an iterator object.
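
To make the comparison concrete, here is a hypothetical counter written both ways; the names count_up_to and CountUpTo are invented for illustration, and the class is written in Python 2 style (in Python 3 the method would be called __next__):

# generator version: the state lives in local variables
def count_up_to(n):
    i = 0
    while i < n:
        yield i
        i += 1

# equivalent hand-written iterator class: the state lives in instance attributes
class CountUpTo(object):
    def __init__(self, n):
        self.i = 0
        self.n = n
    def __iter__(self):
        return self
    def next(self):                 # would be __next__ in Python 3
        if self.i >= self.n:
            raise StopIteration
        value = self.i
        self.i += 1
        return value

print(list(count_up_to(3)))   # [0, 1, 2]
print(list(CountUpTo(3)))     # [0, 1, 2]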

A broader question is why iterators are useful at all. When an iterator is used to power a loop, the loop becomes very simple: the code that initialises the state, decides whether the loop is finished, and finds the next value is extracted into a separate place. This highlights the body of the loop, the most interesting part. In addition, the iterator code can be reused elsewhere.

Bidirectional communication
Each yield statement causes a value to be passed to the caller. This was the reason for the introduction of generators by PEP 255 (implemented in Python 2.2). But communication in the opposite direction is also useful. One obvious way would be some external state, either a global variable or a shared mutable object. Direct communication became possible thanks to PEP 342 (implemented in 2.5), achieved by turning the previously boring yield statement into an expression. When the generator resumes execution after a yield statement, the caller can call a method on the generator object either to pass a value into the generator, which is then returned by the yield expression, or to inject an exception into the generator through a different method.

The first of the new methods is send(value), which is similar to next() but passes value into the generator as the value of the yield expression. In fact, g.next() and g.send(None) are equivalent.
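
A tiny sketch of that equivalence (the squares generator is invented for illustration):

>>> def squares():
...     for i in [1, 2, 3]:
...         yield i * i
>>> gen = squares()
>>> gen.send(None)     # behaves exactly like next(gen): starts the generator
1
>>> next(gen)
4
>>> gen.send(None)
9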

The second of the new methods is throw(type, value=None, traceback=None), which is equivalent to inserting the following at the point of the yield statement:

raise type, value, traceback

Unlike raise (which immediately raises an exception from the current execution point), throw() first resumes the generator and only then raises the exception. The word throw was picked because it suggests putting the exception in another location, and it is associated with exceptions in other languages.

What happens when an exception is raised inside the generator? It can be raised explicitly or while executing some statement, or it can be injected at the point of a yield statement by means of the throw() method. In either case the exception propagates in the standard way: it can be intercepted by an except or finally clause, or otherwise it aborts the execution of the generator function and propagates to the caller.

For completeness' sake, it is worth mentioning that generator iterators also have a close() method, which can be used to force a generator that would otherwise be able to provide more values to finish immediately. It allows the generator's __del__ method to destroy the objects holding the generator's state.

Let's define a generator that prints only what is passed through the send and throw methods.

>>> import itertools
>>> def g():
...     print '--start--'
...     for i in itertools.count():
...         print '--yielding %i--' % i
...         try:
...             ans = yield i
...         except GeneratorExit:
...             print '--closing--'
...             raise
...         except Exception as e:
...             print '--yield raised %r--' % e
...         else:
...             print '--yield returned %s--' % ans

>>> it = g()
>>> next(it)
--start--
--yielding 0--
0
>>> it.send(11)
--yield returned 11--
--yielding 1--
1
>>> it.throw(IndexError)
--yield raised IndexError()--
--yielding 2--
2
>>> it.close()
--closing--

Note: next or __next__?

In Python 2.x, the iterator method that retrieves the next value is called next. It is invoked implicitly by the global function next, which suggests that it ought to be called __next__, just like the global function iter calls __iter__. This inconsistency was fixed in Python 3.x, where it.next becomes it.__next__. For the other generator methods, send and throw, the situation is more complicated, because they are not invoked implicitly by the interpreter. Nevertheless, there is a proposed syntax extension that would let continue take an argument, which would be passed to send on the loop's iterator. If this extension is accepted, gen.send would likely become gen.__send__. The last generator method, close, is arguably named incorrectly, because it is already invoked implicitly.

Chained generators
Note: this is a preview of PEP 380 (not yet implemented, but already accepted for Python 3.3).

Let's say we are writing a generator and we want to yield a number of values generated by a second generator, a subgenerator. If yielding the values is the only concern, this can be done without much difficulty with a loop such as:

subgen = some_other_generator()
for v in subgen:
    yield v

However, if the subgenerator needs to interact properly with the caller in the case of calls to send(), throw() and close(), things become considerably more complicated. The yield statement has to be guarded by a try...except...finally structure similar to the one defined in the previous section to "debug" the generator function. Such code is provided in PEP 380; here it suffices to say that new syntax to properly yield from a subgenerator is being introduced in Python 3.3:

yield from some_other_generator()

This behaves like the explicit loop above, repeatedly yielding values from some_other_generator until it is exhausted, but it also forwards send, throw and close to the subgenerator.
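
Once an interpreter that understands the new syntax is available (Python 3.3 or later), a minimal sketch might look like this; the generator names subgen and delegating are invented for illustration:

# requires Python 3.3+ for the 'yield from' syntax
def subgen():
    yield 1
    yield 2

def delegating():
    yield 0
    yield from subgen()   # forwards next, send, throw and close to the subgenerator
    yield 3

print(list(delegating()))   # [0, 1, 2, 3]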
