What are the interesting, cool, and unexpected things you can do with generators (Generator) and yield?
Any programming language is fair game: Python, JavaScript, and so on.
Answers:
One of the best-known uses of yield in JavaScript is implementing asynchronous control flow on top of Promises/thunks, as in the famous tj/co on GitHub, so it's hardly "unexpected" anymore.
Once you understand how generators behave, implementing a toy version of co is simple:
```javascript
function async(generator) {
  return new Promise(function (resolve, reject) {
    var g = generator();
    function next(val) {
      var result = g.next(val);
      var value = result.value;
      if (!result.done) {
        // Not finished: the yielded value is a Promise; resume on settlement
        value.then(next).catch(reject);
      } else {
        // Finished: resolve with the generator's return value
        resolve(value);
      }
    }
    next();
  });
}
```
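The same trick carries over to Python. Below is a sketch (the `fetch` thunk and the URLs are made up for illustration) that drives a generator yielding thunks, i.e. functions that accept a callback, much the way co drives Promise-yielding generators:

```python
def run(gen_func):
    """Drive a generator that yields thunks (functions taking a callback)."""
    g = gen_func()
    results = []

    def step(value=None):
        try:
            thunk = g.send(value)    # resume the generator with the last result
        except StopIteration as e:
            results.append(e.value)  # generator returned: overall result
            return
        thunk(step)                  # run the async op; `step` is its callback

    step()
    return results[0] if results else None

# A fake async operation: calls its callback immediately with a result
def fetch(url):
    def thunk(callback):
        callback("data from " + url)
    return thunk

def main():
    a = yield fetch("http://a.example")
    b = yield fetch("http://b.example")
    return [a, b]

print(run(main))  # → ['data from http://a.example', 'data from http://b.example']
```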
Isn't the most typical example async/await?
For those who don't see how yield can implement async/await, here is a rough attempt at a C# example:
```csharp
IEnumerable<Task> SomeAsyncMethod()
{
    // blabla
    yield return await(asyncMethod, context);
    // blabla
    yield return await(asyncMethod, context);
    // blabla
}
```
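The C# sketch is gesturing at the fact that the compiler rewrites an async method into a resumable state machine, which is exactly what a generator already is. A minimal Python illustration of that suspend/resume behavior (all names here are invented for the example):

```python
def some_async_method(steps):
    # Each yield suspends here; the caller resumes us once the
    # "awaited" result is ready, passing it in via send()
    first = yield "awaiting step 1"
    steps.append(first)
    second = yield "awaiting step 2"
    steps.append(second)

steps = []
g = some_async_method(steps)
print(next(g))             # runs until the first yield → 'awaiting step 1'
print(g.send("result 1"))  # resumes with the awaited value → 'awaiting step 2'
try:
    g.send("result 2")     # resumes again; the method runs to completion
except StopIteration:
    pass
print(steps)               # → ['result 1', 'result 2']
```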
You can do animation. For example:
```python
# -*- coding: utf-8 -*-
import math
import random

import numpy as np
import matplotlib.pyplot as plt
import matplotlib.animation as animation

# Libraries to install: numpy and matplotlib (Anaconda is recommended)
fig, axes1 = plt.subplots()

# Set the axis ranges
axes1.set_ylim(0, 1.4)
axes1.set_xlim(0, 1 * np.pi / 0.01)

# Initial x and y arrays
xdata = np.arange(0, 2 * np.pi, 0.01)
ydata = np.sin(xdata)

# Get the line object
line, = axes1.plot(xdata)

# Burr (noise) magnification: starts at 0; the larger it is, the bigger the offset
offset = 0.0

# Because FuncAnimation is driven by data_gen, update receives the
# yielded data, not a frame number
def update(data):
    line.set_ydata(data)
    return line,

# Recompute the whole curve and yield one full frame at a time
def data_gen():
    global offset
    while True:
        length = float(len(xdata))
        for i in range(len(xdata)):
            ydata[i] = math.sin(xdata[i]) + 0.2
            if i > length / 18.0 and i < (length * 2.7 / 6.0):
                ydata[i] += offset * (random.random() - 0.5)
        offset += 0.05
        # Cap the offset and wrap around
        if offset >= 0.5:
            offset = 0.0
        yield ydata

# Configuration complete; start playback (interval in milliseconds)
ani = animation.FuncAnimation(fig, update, data_gen, interval=50, repeat=True)
plt.show()
```
Discrete-event simulation. Is there a more concise and elegant way to do it?
Overview — SimPy 3.0.8 documentation
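SimPy itself is the mature tool; as a toy illustration of the idea (all names invented here, not the SimPy API), each simulated process is a generator that yields the delay until its next event, and a priority queue wakes processes in time order:

```python
import heapq

def clock(name, ticks, period, log):
    # A simulated process: fires `ticks` events, one every `period` time units
    for _ in range(ticks):
        t = yield period       # suspend: "sleep" for `period` simulated units
        log.append((t, name))  # resumed at simulated time t by the scheduler

def simulate(procs):
    # Minimal event loop: a priority queue of (wake_time, tie_break, process)
    queue = []
    for i, p in enumerate(procs):
        try:
            delay = next(p)    # prime: run the process to its first yield
        except StopIteration:
            continue
        heapq.heappush(queue, (delay, i, p))
    counter = len(procs)
    while queue:
        now, _, p = heapq.heappop(queue)
        try:
            delay = p.send(now)  # wake the process, telling it the current time
        except StopIteration:
            continue             # process finished; drop it
        heapq.heappush(queue, (now + delay, counter, p))
        counter += 1

log = []
simulate([clock("fast", 3, 1, log), clock("slow", 2, 2, log)])
print(log)  # → [(1, 'fast'), (2, 'slow'), (2, 'fast'), (3, 'fast'), (4, 'slow')]
```

Note how the processes read like straight-line code even though the scheduler interleaves them; that is the same appeal SimPy's documentation emphasizes.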
This question is right up my alley.
When someone claims to have built a sandbox in CPython, you can use yield to tease them. I was looking through the code and saw that someone had submitted this, but didn't run it: ...
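The original snippet is elided, but the flavor of such escapes is that a generator object exposes its execution frame, and the frame exposes the real builtins, which defeats a naive sandbox that merely filters names. A minimal illustration:

```python
def gen():
    yield

g = gen()
next(g)                      # advance to the yield; the generator is suspended
frame = g.gi_frame           # a live generator exposes its execution frame...
builtins = frame.f_builtins  # ...and the frame exposes the real builtins dict
print(builtins["len"]("abcd"))  # → 4
```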
Cool beyond words: A Curious Course on Coroutines and Concurrency
You can write a concurrency library:
Generator Tricks for Systems Programmers
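In that spirit, here is a toy round-robin scheduler (invented names, not taken from the course materials): each task is a generator that yields to give up control, and the scheduler resumes the tasks in turn.

```python
from collections import deque

def scheduler(tasks):
    """Round-robin cooperative scheduler: each yield gives up the CPU."""
    ready = deque(tasks)
    trace = []
    while ready:
        task = ready.popleft()
        try:
            step = next(task)    # run the task until its next yield
            trace.append(step)
            ready.append(task)   # still alive: requeue it at the back
        except StopIteration:
            pass                 # task finished: drop it
    return trace

def worker(name, n):
    for i in range(n):
        yield f"{name}:{i}"

print(scheduler([worker("a", 2), worker("b", 3)]))
# → ['a:0', 'b:0', 'a:1', 'b:1', 'b:2']
```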
You can write a stream-processing framework. See David Beazley's several PyCon slide decks; I was shocked when I read them. http://www.dabeaz.com
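A sketch of the pipeline style from those slides (the stages here are made up for illustration): each stage is a generator that lazily consumes the previous one, so the whole stream is processed one item at a time, with no intermediate lists.

```python
def numbers(lines):
    # Stage 1: parse each line into an int, skipping blanks
    for line in lines:
        line = line.strip()
        if line:
            yield int(line)

def only_even(values):
    # Stage 2: keep only even values
    for v in values:
        if v % 2 == 0:
            yield v

def running_total(values):
    # Stage 3: emit the running sum
    total = 0
    for v in values:
        total += v
        yield total

raw = ["1", "2", "", "3", "4", "10"]
pipeline = running_total(only_even(numbers(raw)))
print(list(pipeline))  # → [2, 6, 16]
```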
You can use them when training neural networks.
For example, Lasagne/lasagne on GitHub.
A code sample from it:
```python
import itertools
import numpy as np

def train(iter_funcs, dataset, batch_size=BATCH_SIZE):
    """Train the model on `dataset` with mini-batch training.
    Each mini-batch has `batch_size` recordings."""
    num_batches_train = dataset['num_examples_train'] // batch_size
    num_batches_valid = dataset['num_examples_valid'] // batch_size

    for epoch in itertools.count(1):
        batch_train_losses = []
        for b in range(num_batches_train):
            batch_train_loss = iter_funcs['train'](b)
            batch_train_losses.append(batch_train_loss)
        avg_train_loss = np.mean(batch_train_losses)

        batch_valid_losses = []
        batch_valid_accuracies = []
        for b in range(num_batches_valid):
            batch_valid_loss, batch_valid_accuracy = iter_funcs['valid'](b)
            batch_valid_losses.append(batch_valid_loss)
            batch_valid_accuracies.append(batch_valid_accuracy)
        avg_valid_loss = np.mean(batch_valid_losses)
        avg_valid_accuracy = np.mean(batch_valid_accuracies)

        yield {
            'number': epoch,
            'train_loss': avg_train_loss,
            'valid_loss': avg_valid_loss,
            'valid_accuracy': avg_valid_accuracy,
        }
```
Tornado uses generators to implement its coroutine model; coupled with an event loop, that achieves high concurrency.
You can also use generators to implement an iterator that traverses a binary tree.
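A minimal sketch of the binary-tree iterator (the `Node` class is a hypothetical stand-in): `yield from` delegates to the subtree generators, so in-order traversal is just three lines.

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def in_order(node):
    # Recursive in-order traversal; yield from delegates to the subtrees
    if node is not None:
        yield from in_order(node.left)
        yield node.value
        yield from in_order(node.right)

tree = Node(2, Node(1), Node(3, None, Node(4)))
print(list(in_order(tree)))  # → [1, 2, 3, 4]
```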