It turns out that Python has a built-in function similar to the map/reduce concept in Hadoop, so it is quite interesting to play with. Its functionality is simple, but it does everything needed.
map(func, *iterables) --> map object

Make an iterator that computes the function using arguments from each of the iterables. Stops when the shortest iterable is exhausted.
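A quick sketch of the "stops at the shortest iterable" rule from the docstring (the lists here are just illustrative):

```python
# map() takes one element from each iterable per call and stops as soon
# as the shortest iterable runs out.
sums = map(lambda x, y: x + y, [1, 2, 3], [10, 20])
print(list(sums))  # [11, 22] -- the 3 in the first list is never used
```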
func is a function, and its number of parameters is determined by the number of iterables. func is called once with one element taken from each iterable, and the result of each call becomes one element of the returned iterator.
The map object is implemented lazily, like a generator: the callback function is only called, and a result produced, when __next__() is invoked.
def func(x, y):
    return x * y * 2

lst = [1, 2, 3, 4, 5]
result = map(func, lst, lst)
print(result.__next__())  # consumes the first element
for r in result:          # iterates over the remaining elements
    print(r)
Result: 2 8 18 32 50
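To see the laziness described above in action, here is a small sketch using a helper that records when it is actually called (noisy and calls are illustrative names, not part of the example above):

```python
calls = []  # records which arguments the function has actually received

def noisy(x):
    calls.append(x)
    return x * 2

m = map(noisy, [1, 2, 3])
print(calls)    # [] -- creating the map object computes nothing
print(next(m))  # 2 -- noisy(1) is called only now
print(calls)    # [1]
```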
In fact, we can also implement our own version of the map function:
def map(func, *iters):
    for it in zip(*iters):
        # The asterisk unpacks the tuple it, so its elements are passed
        # as separate arguments rather than the whole tuple as a single argument.
        yield func(*it)
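As a quick sanity check, the hand-rolled version (renamed my_map here so the builtin stays available for comparison) produces the same results as the builtin map on the earlier example:

```python
def my_map(func, *iters):
    for it in zip(*iters):
        yield func(*it)  # unpack each tuple of elements into arguments

data = [1, 2, 3, 4, 5]
double = lambda x, y: x * y * 2
assert list(my_map(double, data, data)) == list(map(double, data, data))
print(list(my_map(double, data, data)))  # [2, 8, 18, 32, 50]
```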
Note: the code above was tested under Python 3.2. In Python 3, built-ins such as apply(), execfile(), file(), reduce(), and reload() were removed (reduce() now lives in the functools module); callable() was removed in Python 3.0 but reinstated in 3.2.