What are the cool things that yield can do?

What interesting, cool, and unexpected things can be done with generators and yield? The question is not limited to any particular programming language, e.g. Python and JavaScript.

Reply: The most common use of yield in JavaScript is to implement asynchronous flow control on top of Promises/Thunks, as in the well-known tj/co on GitHub, so that is no longer an unexpected thing.
Once you understand how generators work, it is easy to implement a toy version of co:

function async(generator) {
  return new Promise(function(resolve, reject) {
    var g = generator()
    function next(val) {
      var result = g.next(val)
      var value = result.value
      if (!result.done) {
        value.then(next).catch(reject)
      } else {
        resolve(value)
      }
    }
    next()
  })
}
Isn't async/await itself the most typical example?
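As an aside, here is a minimal sketch of my own (not part of the original answer) of the same driver idea in Python: a trampoline steps a generator and resumes it with the result of whatever Future it yields, which is roughly what async/await automates for you. The names run, task and slow_add are made up, and concurrent.futures stands in for the Promise.

import concurrent.futures
import threading

pool = concurrent.futures.ThreadPoolExecutor()

def run(gen_func):
    # Drive a generator that yields Futures; block until it runs to completion.
    # (Error handling is omitted for brevity.)
    gen = gen_func()
    finished = threading.Event()

    def step(value=None):
        try:
            fut = gen.send(value)          # resume the generator
        except StopIteration:
            finished.set()                 # the generator returned
            return
        # When the yielded Future resolves, feed its result back in.
        fut.add_done_callback(lambda f: step(f.result()))

    step()
    finished.wait()

def slow_add(a, b):
    return a + b

def task():
    # Each yield hands a Future to the driver and suspends until it resolves.
    x = yield pool.submit(slow_add, 1, 2)
    y = yield pool.submit(slow_add, x, 10)
    print("result:", y)                    # -> result: 13

run(task)
pool.shutdown()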


Don't know how yield can implement async/await? Here is an example in C# pseudocode:

IEnumerable<Task> SomeAsyncMethod()   // element type assumed; the pattern is the point
{
    // blabla
    yield return await(asyncMethod, context);
    // blabla
    yield return await(asyncMethod, context);
    // blabla
}
You can make animations.

# -*- coding: utf-8 -*-
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.animation as animation
import math, random

# Required libraries: NumPy and Matplotlib (easiest to get via Anaconda).
fig, axes1 = plt.subplots()

# Set the axis ranges.
axes1.set_ylim(0, 1.4)
axes1.set_xlim(0, 1 * np.pi / 0.01)

# Initial x and y arrays.
xdata = np.arange(0, 2 * np.pi, 0.01)
ydata = np.sin(xdata)
line, = axes1.plot(xdata)

# Noise amplitude; it grows from 0, and the larger the offset the larger the burrs.
offset = 0.0

# Because frames=data_gen, the first argument passed to update() is the
# yielded data, not the frame number.
def update(data):
    line.set_ydata(data)
    return line,

# Recompute the whole curve each time and hand it back with yield.
def data_gen():
    global offset
    while True:
        length = float(len(xdata))
        for i in range(len(xdata)):
            ydata[i] = math.sin(xdata[i]) + 0.2
            if i > length / 18.0 and i < (length * 2.7 / 6.0):
                ydata[i] += offset * (random.random() - 0.5)
        offset += 0.05
        # Cap the offset.
        if offset >= 0.5:
            offset = 0.0
        yield ydata

# Everything is set; start playing.
ani = animation.FuncAnimation(fig, update, data_gen, interval=800, repeat=True)
plt.show()
Is there a more concise and elegant way to simulate discrete events?

Overview - SimPy 3.0.8 documentation. This one I can answer: SimPy models each simulated process as a generator function, and yield is how a process waits for events in simulated time.
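A minimal sketch of my own (not taken from the SimPy documentation) of how a SimPy process is just a generator that yields events:

import simpy  # assumed installed: pip install simpy

def customer(env, name, counter):
    # A SimPy "process" is an ordinary generator function: each yield
    # suspends it until the yielded event fires in simulated time.
    print(env.now, name, "arrives")
    with counter.request() as req:
        yield req                 # wait until the counter is free
        print(env.now, name, "is being served")
        yield env.timeout(5)      # the service itself takes 5 time units
        print(env.now, name, "leaves")

env = simpy.Environment()
counter = simpy.Resource(env, capacity=1)
for i in range(3):
    env.process(customer(env, "customer %d" % i, counter))
env.run()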

When someone claims to have implemented a sandbox in CPython, you can use yield to mess with them. I was reading through the code once and saw that someone had committed this but never ran it: ... Cool, but it didn't work.
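The submitted code isn't reproduced here, but as a rough illustration of the kind of hole generators punch in a sandbox (a sketch of my own, not that code): a suspended generator keeps its frame object alive, and that frame hands back the real globals and builtins that a naive sandbox believes it has hidden.

def probe():
    yield  # suspend here so the frame stays alive

g = probe()
next(g)  # run the generator up to the yield

# The suspended generator exposes its frame object, and the frame exposes
# the module's real globals and the full builtins, including __import__.
print(g.gi_frame.f_globals is globals())        # True
print("__import__" in g.gi_frame.f_builtins)    # True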

A Curious Course on Coroutines and Concurrency: you can write a concurrent database. Generator Tricks for Systems Programmers: you can write a stream-processing framework. See David Beazley's PDFs at http://www.dabeaz.com; I read them several times and was shocked each time.
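In that spirit, a minimal stream-processing sketch of my own (the log lines are made up): each stage is a generator, so the pipeline is lazy and handles one record at a time, which is what lets it scale to huge streams.

# Toy "access log"; in real use these lines would be pulled from files.
log = [
    "127.0.0.1 - - [x] GET /index.html 200 5120",
    "127.0.0.1 - - [x] GET /missing.png 404 312",
    "10.0.0.7  - - [x] GET /favicon.ico 404 209",
]

def grep(pattern, lines):
    # Filter stage: pass through only the lines containing pattern.
    return (line for line in lines if pattern in line)

def column(index, lines):
    # Projection stage: pull one whitespace-separated field out of each line.
    return (line.split()[index] for line in lines)

not_found = grep(" 404 ", log)
sizes = column(-1, not_found)                 # last field = bytes transferred
print("bytes spent on 404s:", sum(int(s) for s in sizes))   # -> 521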
Generators can also be used to train neural networks. For example, the sample code in Lasagne/Lasagne · GitHub:

def train(iter_funcs, dataset, batch_size=BATCH_SIZE):
    """Train the model with `dataset` with mini-batch training. Each
       mini-batch has `batch_size` recordings.
    """
    num_batches_train = dataset['num_examples_train'] // batch_size
    num_batches_valid = dataset['num_examples_valid'] // batch_size

    for epoch in itertools.count(1):
        batch_train_losses = []
        for b in range(num_batches_train):
            batch_train_loss = iter_funcs['train'](b)
            batch_train_losses.append(batch_train_loss)
        avg_train_loss = np.mean(batch_train_losses)

        batch_valid_losses = []
        batch_valid_accuracies = []
        for b in range(num_batches_valid):
            batch_valid_loss, batch_valid_accuracy = iter_funcs['valid'](b)
            batch_valid_losses.append(batch_valid_loss)
            batch_valid_accuracies.append(batch_valid_accuracy)
        avg_valid_loss = np.mean(batch_valid_losses)
        avg_valid_accuracy = np.mean(batch_valid_accuracies)

        yield {
            'number': epoch,
            'train_loss': avg_train_loss,
            'valid_loss': avg_valid_loss,
            'valid_accuracy': avg_valid_accuracy,
        }
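The point is that train() yields one summary dict per epoch, so the caller decides how long to keep training. A sketch of how it might be consumed (iter_funcs, dataset and BATCH_SIZE are assumed to exist as above; NUM_EPOCHS is a stand-in of my own):

import itertools

NUM_EPOCHS = 10   # stand-in; run however many epochs you want

for epoch in itertools.islice(train(iter_funcs, dataset), NUM_EPOCHS):
    print("epoch {} | train loss {:.4f} | valid loss {:.4f} | valid acc {:.2%}".format(
        epoch['number'], epoch['train_loss'],
        epoch['valid_loss'], epoch['valid_accuracy']))
    # Because train() is a generator, you can also stop early, e.g. on a
    # validation-loss plateau, simply by breaking out of the loop.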
Tornado's coroutine model is implemented with generators plus an event loop, which is how it achieves high concurrency. You can also use a generator to traverse a binary tree, as in the sketch below.
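A minimal sketch of the traversal idea (the Node class and the tree are my own): a recursive generator makes in-order traversal lazy, with yield from delegating to each subtree.

class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def inorder(node):
    # Lazily yield values in sorted order for a binary search tree.
    if node is not None:
        yield from inorder(node.left)
        yield node.value
        yield from inorder(node.right)

#        4
#       / \
#      2   6
#     / \   \
#    1   3   7
root = Node(4, Node(2, Node(1), Node(3)), Node(6, right=Node(7)))
print(list(inorder(root)))   # [1, 2, 3, 4, 6, 7]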
