1. Iterators (iterator)
In Python, a for loop can be used with any sequence type, including lists, tuples, and so on. In fact, the for loop works on any "iterable object" — anything that can produce an iterator.
An iterator is an object that implements the iterator protocol. In Python, the iterator protocol means the object has a next method (next() in Python 3) that advances to the next result, and raises StopIteration at the end of the series of results. Any such object can be traversed with a for loop or any other iteration tool in Python; the iteration tool calls the next method on each step and catches the StopIteration exception to know when to stop.
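The protocol can be driven by hand. A minimal sketch, using the built-in next() function (equivalent to calling the iterator's next method in Python 2):

```python
# Manually drive the iterator protocol on a list.
items = [10, 20, 30]
it = iter(items)          # obtain an iterator from the iterable

results = []
while True:
    try:
        results.append(next(it))  # advance to the next result
    except StopIteration:         # raised when the series is exhausted
        break

assert results == [10, 20, 30]
```

This loop is exactly what a for statement does behind the scenes.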
One obvious benefit of using an iterator is that you read only one item from the object at a time, avoiding excessive memory overhead.
For example, to read the contents of a file line by line with the readlines() method, we could write:
```python
for line in open("test.txt").readlines():
    print line
```
This works, but it is not the best way: it actually loads the entire file into memory at once and then prints it one line at a time. When the file is large, this method carries a heavy memory cost.
Using the file iterator, we can write:
```python
for line in open("test.txt"):  # use the file iterator
    print line
```
This is the simplest and fastest-running version: it does not read the whole file explicitly, but lets the iterator read one line at a time.
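To make the point self-contained and runnable anywhere, the sketch below first writes a small temporary file, then iterates the file object, which yields one line at a time without loading the whole file into memory:

```python
import os
import tempfile

# Write a small sample file so the example is self-contained.
fd, path = tempfile.mkstemp(text=True)
with os.fdopen(fd, "w") as f:
    f.write("alpha\nbeta\ngamma\n")

# The file object is its own iterator: each step reads one line.
lines = []
with open(path) as f:
    for line in f:
        lines.append(line.rstrip("\n"))

os.remove(path)
assert lines == ["alpha", "beta", "gamma"]
```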
2. Generators (generator)
Generator functions are tied to the iterator-protocol concept in Python. In short, a function that contains a yield statement is compiled specially as a generator. When called, such a function returns a generator object that supports the iterator interface. The function may contain a return statement, but its real purpose is to yield values.
Unlike a normal function, which produces a value and exits, a generator function automatically suspends its execution after yielding a value; its local variables retain their state, and that state becomes valid again when the function resumes.
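A small sketch of that suspend-and-resume behavior, using the built-in next() (the Python 2 transcripts below call the t.next() method instead, which is equivalent):

```python
def counter():
    # The local variable n is preserved while the generator is suspended.
    n = 0
    while True:
        n += 1
        yield n

c = counter()
first = next(c)   # runs to the first yield, then suspends
second = next(c)  # resumes; n still holds its previous value
assert first == 1 and second == 2
```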
```python
>>> def g(n):
...     for i in range(n):
...         yield i ** 2
...
>>> for i in g(5):
...     print i, ":",
...
0 : 1 : 4 : 9 : 16 :
```
To understand how it works, let's call the next method ourselves:
```python
>>> t = g(5)
>>> t.next()
0
>>> t.next()
1
>>> t.next()
4
>>> t.next()
9
>>> t.next()
16
>>> t.next()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
StopIteration
```
After next has been called five times, the generator raises a StopIteration exception and the iteration terminates.
Let's look at another yield example: generating the Fibonacci sequence with a generator:
```python
def fab(max):
    a, b = 0, 1
    while a < max:
        yield a
        a, b = b, a + b
```

```python
>>> for i in fab(20):
...     print i, ",",
...
0 , 1 , 1 , 2 , 3 , 5 , 8 , 13 ,
```
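Because a generator supports the iterator interface, any iteration tool can consume it, not just a for loop. For instance, list() collects all the yielded values (the fab definition from above is repeated here so the snippet stands alone):

```python
def fab(max):
    a, b = 0, 1
    while a < max:
        yield a
        a, b = b, a + b

# list() drives the generator until StopIteration, collecting each value.
fib20 = list(fab(20))
assert fib20 == [0, 1, 1, 2, 3, 5, 8, 13]
```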
By this point, the rather abstract concept of a generator should be much clearer.
```python
def read_file(fpath):
    block_size = 1024
    with open(fpath, 'rb') as f:
        while True:
            block = f.read(block_size)
            if block:
                yield block
            else:
                return
```
Calling the read() method directly on a file object can cause unpredictable memory consumption. A better approach is to read the file's contents continuously through a fixed-length buffer. Thanks to yield, we no longer need to write a separate iterator class for reading the file; a few lines implement the lazy file reader.
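A usage sketch of the block reader above. The block_size parameter is an assumption added here for testability (the article's version hard-codes 1024); the snippet feeds the generator a temporary 2500-byte file:

```python
import os
import tempfile

# Restated block reader; block_size is a parameter here (an assumption
# for this sketch -- the article's version fixes it at 1024).
def read_file(fpath, block_size=1024):
    with open(fpath, 'rb') as f:
        while True:
            block = f.read(block_size)
            if block:
                yield block
            else:
                return

# Create a sample file larger than two blocks.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"x" * 2500)

blocks = [len(b) for b in read_file(path)]
os.remove(path)
assert blocks == [1024, 1024, 452]  # two full blocks plus the remainder
```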