Tips for implementing iterators in Python: Summary
Implementing an iterator with yield
As described in the introduction, manually implementing __iter__ and next every time you want an iterable function is a little troublesome, and the amount of required code is considerable. In Python, yield can also be used to implement an iterator. The key capability of yield is that it can suspend the current execution logic, preserve the state (the values of local variables, the execution position, and so on), and return the corresponding value; the next call then continues seamlessly from where the previous execution stopped, so the loop repeats until the exit condition set in advance is reached, or an error occurs and execution is forcibly interrupted.
yield works a bit like return in that it hands a value back from the function. The difference is that after yield returns, the function can later continue executing from the point where yield returned. In other words, yield passes a value back to the caller and then, when resumed, jumps straight back so the function keeps running until the next yield statement produces a new value. What the caller actually obtains from calling the function is an iterator (generator) object; the yielded values are the values it produces, and calling the iterator's next() method makes the function restore the execution environment of the yield statement and continue running until the next yield is encountered. If no further yield is found, a StopIteration exception is raised, indicating that the iteration has ended.
Let's look at an example:
>>> def test_yield():
...     yield 1
...     yield 2
...     yield (1,2)
...
>>> a = test_yield()
>>> a.next()
1
>>> a.next()
2
>>> a.next()
(1, 2)
>>> a.next()
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
StopIteration
Just from the description you can tell that it works in the same way as an iterator, right? Indeed, yield turns its enclosing function into an iterator. Let's take the classic Fibonacci series to briefly describe how it works:
def fab(max):
    n, a, b = 0, 0, 1
    while n < max:
        print b, "is generated"
        yield b
        a, b = b, a + b
        n = n + 1

>>> for item in fab(5):
...     print item
...
1 is generated
1
1 is generated
1
2 is generated
2
3 is generated
3
5 is generated
5
Let's look back at the syntactic sugar of the for keyword. When we traverse the first five Fibonacci numbers, it is clear that fab(5) has created an iterable object: at the beginning of the traversal its __iter__ method is called to return the iterator object that actually does the work, and each subsequent call to the next method returns one Fibonacci number, which is then printed out.
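To make that syntactic sugar concrete, here is a rough sketch, written in the same Python 2 style as the examples above, of what the for loop does behind the scenes (the variable names are only for illustration):

# Roughly what "for item in fab(5): print item" expands to
iterable = fab(5)               # calling the generator function builds the iterable
iterator = iter(iterable)       # for calls __iter__; a generator is its own iterator
while True:
    try:
        item = iterator.next()  # Python 2 spelling; next(iterator) also works
    except StopIteration:       # raised once the generator has no more yields
        break
    print item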
We can print out the attributes of the object returned by calling the generator function to see what happened:
>>> temp_gen = fab(5)
>>> dir(temp_gen)
['__class__', '__delattr__', '__doc__', '__format__', '__getattribute__',
 '__hash__', '__init__', '__iter__', '__name__', '__new__', '__reduce__',
 '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__',
 '__subclasshook__', 'close', 'gi_code', 'gi_frame', 'gi_running', 'next',
 'send', 'throw']
As described above, simply calling fab does not make the function execute and return a value immediately, and as can be seen from the printed attribute list of fab(5), the object returned by the generator function already includes implementations of __iter__ and next. Compared with a manual implementation, yield easily gives us the behavior we want while reducing the amount of code.
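For comparison, a hand-written equivalent might look roughly like the class below (an illustrative sketch, not code from the original article); it has to track explicitly all the state that yield preserves for us automatically:

class Fab(object):
    """Hand-rolled Fibonacci iterator: everything yield spares us from writing."""
    def __init__(self, max):
        self.max = max
        self.n, self.a, self.b = 0, 0, 1

    def __iter__(self):
        return self

    def next(self):                  # would be __next__ in Python 3
        if self.n < self.max:
            value = self.b
            self.a, self.b = self.b, self.a + self.b
            self.n = self.n + 1
            return value
        raise StopIteration

# behaves like fab(5) above, minus the "is generated" prints
for item in Fab(5):
    print item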
Generator Expression
In Python, another and more elegant way to create an iterator object is the generator expression. It looks almost the same as a list comprehension, with the brackets [] simply changed to parentheses (), but the results produced by this small change are quite different:
>>> temp_gen = (x for x in range(5))
>>> temp_gen
<generator object <genexpr> at 0x7192d8>
>>> for item in temp_gen:
...     print item
...
0
1
2
3
4
>>> dir(temp_gen)
['__class__', '__delattr__', '__doc__', '__format__', '__getattribute__',
 '__hash__', '__init__', '__iter__', '__name__', '__new__', '__reduce__',
 '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__',
 '__subclasshook__', 'close', 'gi_code', 'gi_frame', 'gi_running', 'next',
 'send', 'throw']
Having read the description of yield, this example and its output should be quite straightforward. Whether you look at the printed representation of temp_gen, the results returned when traversing it with the for statement, or the attribute list output by calling dir, it is clear that the generator expression really does produce an object that supports iteration. In addition, functions can be called inside the expression, and appropriate filtering conditions can be added, as the next example shows.
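For instance, here is a small illustrative sketch (the square helper is made up for this example) that combines a function call and a filtering condition in a single generator expression:

def square(x):
    return x * x

# squares of the even numbers below 10, computed lazily one at a time
even_squares = (square(x) for x in range(10) if x % 2 == 0)
for item in even_squares:
    print item      # prints 0, 4, 16, 36, 64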
The built-in itertools library and the iter function
The built-in Python library itertools provides a large number of utility functions that help us create objects that can be traversed and iterated over efficiently. It contains many interesting and useful helpers such as chain, izip/izip_longest, combinations, ifilter, and so on; a short illustrative sketch of a few of them follows the iter example below. Python also has a very useful built-in iter function, which returns an iterator object; you can take a quick look at its help document:
>>> iter('abc')
<iterator object at 0x718590>
>>> str_iterator = iter('abc')
>>> next(str_iterator)
'a'
>>> next(str_iterator)
'b'
>>> lst_gen = iter([1,2,3,4])
>>> lst_gen
<listiterator object at 0x728e30>
>>> dir(lst_gen)
['__class__', '__delattr__', '__doc__', '__format__', '__getattribute__',
 '__hash__', '__init__', '__iter__', '__length_hint__', '__new__',
 '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__',
 '__str__', '__subclasshook__', 'next']
>>> help(iter)
Help on built-in function iter in module builtins:

iter(...)
    iter(iterable) -> iterator
    iter(callable, sentinel) -> iterator

    Get an iterator from an object.  In the first form, the argument must
    supply its own iterator, or be a sequence.
    In the second form, the callable is called until it returns the sentinel.
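As promised above, here is a small sketch of a few of the itertools helpers mentioned earlier (Python 2 names; izip_longest and ifilter become zip_longest and the built-in filter in Python 3):

import itertools

# chain: iterate over several iterables as if they were one
for item in itertools.chain([1, 2], (3, 4)):
    print item                  # 1 2 3 4

# izip_longest: like zip, but pads the shorter iterables with fillvalue
for pair in itertools.izip_longest('ab', [1, 2, 3], fillvalue='-'):
    print pair                  # ('a', 1) ('b', 2) ('-', 3)

# combinations: all 2-element combinations of the input
for combo in itertools.combinations('abc', 2):
    print combo                 # ('a', 'b') ('a', 'c') ('b', 'c')

# ifilter: lazily keep only the items for which the predicate is true
for item in itertools.ifilter(lambda x: x % 2, range(5)):
    print item                  # 1 3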