In this article, Python iterators and generators are explained in detail, as follows:
1. Iterator Overview:
Iterators are a way to access the elements of a collection. An iterator object starts at the first element of the collection and is advanced until every element has been visited. Iterators can only move forward, never back, but that is rarely a problem, because backward traversal is seldom needed during iteration.
1.1 Advantages of using iterators
For data structures that natively support random access (such as tuple and list), an iterator has no real advantage over a classic index-based for loop, and the index value is lost (although the built-in function enumerate() can recover it). But for data structures that cannot be randomly accessed (such as set), iterators are the only way to access the elements.
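As a quick illustration (a minimal sketch added here; the list of colors is just an example), enumerate() restores the index that plain iteration does not expose:

colors = ['red', 'green', 'blue']
for i, color in enumerate(colors):
    print i, color
# prints:
# 0 red
# 1 green
# 2 blue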
In addition, one great advantage of iterators is that they do not require all elements to be prepared before iteration begins. An iterator computes an element only when the iteration reaches it; before or after that point, the element may not exist or may already have been discarded. This makes iterators especially suitable for traversing very large or infinite collections, such as a multi-gigabyte file or the Fibonacci sequence.
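For instance (a small sketch, not from the original article), the standard library's itertools.count() is an unbounded iterator, yet it can be consumed safely because values are produced one at a time:

from itertools import count, islice

# count(10) would never end on its own; islice() takes only the first 5 values,
# so nothing unbounded is ever materialized in memory
print list(islice(count(10), 5))   # [10, 11, 12, 13, 14]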
The greatest contribution of iterators, however, is providing a unified interface for accessing collections: any object that defines an __iter__() method can be accessed with an iterator.
An iterator has two basic methods:
next(): returns the next element of the iterator
__iter__(): returns the iterator object itself
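A quick interactive check (a sketch, not part of the original text) shows both methods on a built-in list iterator:

>>> it = iter([1, 2, 3])
>>> it.__iter__() is it      # __iter__() returns the iterator itself
True
>>> it.next()                # next() returns the next element
1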
The following example, which generates Fibonacci numbers, illustrates why iterators are useful.
Example code 1
def fab(max):
    n, a, b = 0, 0, 1
    while n < max:
        print b
        a, b = b, a + b
        n = n + 1
Printing directly inside fab(max) makes the function less reusable, because fab returns None: other functions cannot obtain the sequence that fab produces.
Example code 2
def fab(max):
    L = []
    n, a, b = 0, 0, 1
    while n < max:
        L.append(b)
        a, b = b, a + b
        n = n + 1
    return L
Code 2 satisfies the reusability requirement, but it builds the entire list in memory, which is best avoided when max is large.
Example code 3
Contrast:
for i in range(1000): pass
for i in xrange(1000): pass
The former builds a list of 1000 elements up front, while the latter produces one element per iteration, so iterators solve both the reusability problem and the memory problem.
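A quick check in a Python 2 session (a sketch added here for illustration) makes the difference visible:

>>> type(range(1000))
<type 'list'>
>>> type(xrange(1000))
<type 'xrange'>
>>> import sys
>>> sys.getsizeof(range(1000)) > sys.getsizeof(xrange(1000))
True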
class Fab(object):
    def __init__(self, max):
        self.max = max
        self.n, self.a, self.b = 0, 0, 1

    def __iter__(self):
        return self

    def next(self):
        if self.n < self.max:
            r = self.b
            self.a, self.b = self.b, self.a + self.b
            self.n = self.n + 1
            return r
        raise StopIteration()
Execution:
>>> for key in Fab(5):
        print key

1
1
2
3
5
The Fab class keeps returning the next number of the series through next(), and its memory footprint stays constant.
1.2 Using iterators
Use the built-in factory function iter(iterable) to get an iterator object:
>>> lst = range(5)
>>> it = iter(lst)
>>> it
<listiterator object at 0x01a63110>
Use the next() method to access the next element:
>>> it.next()
0
>>> it.next()
1
>>> it.next()
2
When an iterator runs past the end, Python raises a StopIteration exception:
>>> it.next()
3
>>> it.next
<method-wrapper 'next' of listiterator object at 0x01a63110>
>>> it.next()
4
>>> it.next()
Traceback (most recent call last):
  File "<pyshell#27>", line 1, in <module>
    it.next()
StopIteration
Using StopIteration as the termination signal, you can traverse a collection with an explicit iterator loop:
lst = range(5)
it = iter(lst)
try:
    while True:
        val = it.next()
        print val
except StopIteration:
    pass
In fact, because iterators are so pervasive, Python builds iterator support directly into the for keyword. In a for loop, Python automatically invokes the factory function iter() to get the iterator, automatically calls next() to get each element, and also takes care of checking for the StopIteration exception. As follows:
>>> a = (1, 2, 3, 4)
>>> for key in a:
        print key

1
2
3
4
Python first calls the iter() function on the object after the in keyword to obtain an iterator, and then calls that iterator's next() method to get each element, until a StopIteration exception is raised.
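Roughly speaking, the for loop above is equivalent to the following hand-written expansion (a sketch under that interpretation; the temporary name _it is hypothetical):

a = (1, 2, 3, 4)
_it = iter(a)             # the for statement first calls iter() on the iterable
while True:
    try:
        key = _it.next()  # then repeatedly calls next() to fetch elements
    except StopIteration: # the loop ends when StopIteration is raised
        break
    print key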
1.3 Defining iterators
The following example defines an iterator for the Fibonacci series.
# -*- coding: cp936 -*-
class Fabs(object):
    def __init__(self, max):
        self.max = max
        # Note: item 0 is 0 and item 1 is the first 1; the sequence produced here starts from 1
        self.n, self.a, self.b = 0, 0, 1

    def __iter__(self):
        return self

    def next(self):
        if self.n < self.max:
            r = self.b
            self.a, self.b = self.b, self.a + self.b
            self.n = self.n + 1
            return r
        raise StopIteration()

print Fabs(5)
for key in Fabs(5):
    print key
Result:
<__main__.Fabs object at 0x01a63090>
1
1
2
3
5
2. Generators
Functions that contain yield are called generators in Python. A few examples (again using the Fibonacci sequence) illustrate this.
You can see that code 3 is far less concise than code 1; a generator (using yield) keeps the simplicity of code 1 while retaining the effect of code 3.
Example code 4
def fab(max):
    n, a, b = 0, 0, 1
    while n < max:
        yield b
        a, b = b, a + b
        n = n + 1
Execution:
>>> for n in fab(5):
        print n

1
1
2
3
5
Simply put, the effect of yield is to turn a function into a generator. A function containing yield is no longer an ordinary function; the Python interpreter treats it as a generator. Calling fab(5) does not execute the body of fab; instead it returns an iterable object. When the for loop runs, each pass executes the code inside fab until it reaches yield b, at which point fab returns an iteration value. On the next pass, the code continues from the statement after yield b, and the function's local variables look exactly as they did before the last interruption, so the function keeps executing until it reaches yield again. It looks as if the function were interrupted several times by yield during normal execution, each interruption returning the current iteration value through yield.
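To make the suspend/resume behaviour visible, here is a hypothetical instrumented variant of fab (the trace print statements are added only for illustration and are not part of the original code):

def fab_verbose(max):
    n, a, b = 0, 0, 1
    while n < max:
        print "about to yield", b      # executed just before the function suspends
        yield b
        print "resumed after yield"    # executed when next() is called again
        a, b = b, a + b
        n = n + 1

Iterating over fab_verbose(3) interleaves the trace messages with the yielded values, showing that execution really does pause at yield and continue from the following statement.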
You can also manually invoke the next() method of fab(5) (because fab(5) is a generator object, which has a next() method), which shows the execution process of fab even more clearly:
>>> f = fab(3)
>>> f.next()
1
>>> f.next()
1
>>> f.next()
2
>>> f.next()
Traceback (most recent call last):
  File "<pyshell#62>", line 1, in <module>
    f.next()
StopIteration
The effect of return
In a generator, if there is no return, the generator runs until the end of the function body; if a return is encountered during execution, StopIteration is raised and the iteration terminates immediately. For example, with a variant of fab that returns right after the first yield (see the sketch below):
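The transcript that follows appears to use a variant of fab that is not shown above; a minimal sketch of what such a variant could look like:

def fab(max):
    n, a, b = 0, 0, 1
    while n < max:
        yield b
        return  # a bare return inside a generator ends it: the next next() call raises StopIteration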
>>> s = fab(5)
>>> s.next()
1
>>> s.next()
Traceback (most recent call last):
  File "<pyshell#66>", line 1, in <module>
    s.next()
StopIteration
Example code 5: reading a file
def read_file(fpath):
    block_size = 1024
    with open(fpath, 'rb') as f:
        while True:
            block = f.read(block_size)
            if block:
                yield block
            else:
                return
Calling the read() method directly on a file object loads everything at once and can cause unpredictable memory consumption. A better approach is to keep reading the file into a fixed-length buffer. With yield, we no longer need to write a separate iterator class for reading files; the generator above implements file reading easily.
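A brief usage sketch (the file name is hypothetical): because read_file() yields one block at a time, we can, for example, count the bytes of a large file while keeping at most block_size bytes in memory at once:

total = 0
for block in read_file('some_big_file.bin'):   # hypothetical path
    total += len(block)
print total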