Python Coroutines

Iterators
    • Iterable: an object that can be used directly in a for loop
    • Iterator: an object that can be used in a for loop and can also be advanced manually with next()
    • To tell which one an object is, use isinstance(), as in the sketch below
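
      For example, the abstract base classes in collections.abc make the distinction concrete:

      from collections.abc import Iterable, Iterator

      print(isinstance([1, 2, 3], Iterable))        # True: a list can be looped over
      print(isinstance([1, 2, 3], Iterator))        # False: a list has no __next__
      print(isinstance(iter([1, 2, 3]), Iterator))  # True: iter() returns an iterator
      print(isinstance((x for x in range(3)), Iterator))  # True: generators are iterators
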
Generator
    • Computes each value lazily as the loop runs instead of building the whole sequence up front, so it uses little memory (trading time for space)

    • Advance it with next(); once exhausted, it raises a StopIteration exception

    • Ways to create one:

      1. Generator expression
      g = (x * x for x in range(10))  # square brackets build a list comprehension; parentheses build a generator
      2. A function that contains yield is a generator function; next() runs it until the next yield and returns that value
      def odd():
          print('step 1')
          yield 1
          print('step 2')
          yield 3
          print('step 3')
          yield 5

      g = odd()
      one = next(g)
      print(one)
      two = next(g)
      print(two)
      three = next(g)
      print(three)

      step 1
      1
      step 2
      3
      step 3
      5

      Note: g is a generator object here; each next() call resumes execution right after the previous yield

      3. Calling a generator with a for loop (see the sketch after the output below for recovering the return value)
      def fib(max):
          n, a, b = 0, 0, 1
          while n < max:
              yield b
              a, b = b, a + b
              n += 1
          return 'Done'

      g = fib(5)
      for i in g:
          print(i)

      1
      1
      2
      3
      5
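
      The for loop swallows the StopIteration raised when fib finishes, so the return value 'Done' is discarded. A minimal sketch of driving the same generator by hand to recover it:

      g = fib(5)
      while True:
          try:
              print(next(g))
          except StopIteration as exc:
              print('return value:', exc.value)  # -> 'Done'
              break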

Coroutines
  • Definition: subroutines for non-preemptive multitasking; like a generator, a coroutine can pause its own execution

  • Keywords: yield and send()

    def simple_coroutine(a):
        print('-> start')
        b = yield a
        print('-> received', a, b)
        c = yield a + b
        print('-> received', a, b, c)

    # run
    sc = simple_coroutine(5)
    aa = next(sc)  # prime the coroutine
    print(aa)
    bb = sc.send(6)  # a=5, b=6
    print(bb)
    cc = sc.send(7)  # a=5, b=6, c=7
    print(cc)

    -> start
    5
    -> received 5 6
    11
    -> received 5 6 7

    (The final send(7) then raises StopIteration because the coroutine body has run to completion, so print(cc) is never reached.)

    Analysis: trace it yourself; each send() resumes the coroutine at the paused yield, binds the sent value to the assignment target, and runs to the next yield (see the state-inspection sketch below)
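
    You can confirm the pausing behaviour with inspect.getgeneratorstate(), which reports a coroutine's lifecycle state (a small sketch reusing simple_coroutine above):

    from inspect import getgeneratorstate

    sc = simple_coroutine(5)
    print(getgeneratorstate(sc))  # GEN_CREATED: created, not yet started
    next(sc)                      # prime it
    print(getgeneratorstate(sc))  # GEN_SUSPENDED: paused at a yield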

  • Terminating a coroutine: an unhandled exception inside it bubbles up to the caller; to make it exit cleanly, send a sentinel value (e.g. None) that tells it to return, as sketched below
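
    A minimal sketch of both termination paths, using a hypothetical echo() coroutine:

    def echo():
        while True:
            received = yield
            if received is None:  # sentinel value: exit cleanly
                return
            print('got', received)

    e = echo()
    next(e)           # prime
    e.send('hi')      # -> got hi
    try:
        e.send(None)  # the sentinel makes the coroutine return
    except StopIteration:
        print('coroutine exited')

    # generator.close() is the other path: it raises GeneratorExit inside
    # the coroutine, which also terminates it.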

  • yield from: opens a two-way channel between the innermost subgenerator and the outermost caller (the client code)

    def gen():
        for c in 'AB':
            yield c

    print(list(gen()))

    def gen_new():
        yield from 'AB'

    print(list(gen_new()))
  • Delegating generator: a generator function that contains yield from

      from collections import namedtuple

      Resclass = namedtuple('Res', 'count average')

      # subgenerator
      def averager():
          total = 0.0
          count = 0
          average = None
          while True:
              term = yield
              if term is None:
                  break
              total += term
              count += 1
              average = total / count
          return Resclass(count, average)

      # delegating generator
      def grouper(storages, key):
          while True:
              # capture the return value of averager()
              storages[key] = yield from averager()

      # client code
      def client():
          process_data = {
              'boys_2': [39.0, 40.8, 43.2, 40.8, 43.1, 38.6, 41.4, 40.6, 36.3],
              'boys_1': [1.38, 1.5, 1.32, 1.25, 1.37, 1.48, 1.25, 1.49, 1.46],
          }
          storages = {}
          for k, v in process_data.items():
              # get a coroutine
              coroutine = grouper(storages, k)
              # prime the coroutine
              next(coroutine)
              # send data to the coroutine
              for dt in v:
                  coroutine.send(dt)
              # terminate the coroutine
              coroutine.send(None)
          print(storages)

      # run
      client()

    {'boys_2': Res(count=9, average=40.422222222222224), 'boys_1': Res(count=9, average=1.3888888888888888)}

    Explanation:

      1. The client() function starts; each iteration of the for k, v loop creates a new grouper instance bound to coroutine
      2. next(coroutine) primes the coroutine: it enters the while True loop, calls averager(), and pauses at the yield from expression
      3. The for dt in v loop sends each value into the inner averager; when it finishes, the grouper instance is still paused at yield from, so the assignment to storages[key] has not yet happened
      4. coroutine.send(None) makes term None, so the averager subgenerator breaks out of its loop and raises StopIteration with its return value attached to the exception's value attribute; yield from catches that StopIteration itself and turns the value into the result of the expression, which is assigned to storages[key] (a simplified sketch of this mechanism follows)
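
    A simplified hand-written sketch of what yield from does here (it omits parts of the full PEP 380 expansion, such as forwarding throw() and close(); grouper_manual is an illustrative name):

      def grouper_manual(storages, key):
          while True:
              sub = averager()
              next(sub)  # prime the subgenerator
              while True:
                  term = yield  # receive a value from the client
                  try:
                      sub.send(term)  # forward it to the subgenerator
                  except StopIteration as exc:
                      # the subgenerator's return value rides on the exception
                      storages[key] = exc.value
                      break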
Asyncio
    • Steps: create an event loop (it mediates asynchronous I/O, acting like a mailbox or message queue), run the coroutines on it, then close the loop

      import threading
      import asyncio

      @asyncio.coroutine
      def hello():
          print('Hello world! (%s)' % threading.currentThread())
          print('Starting......(%s)' % threading.currentThread())
          yield from asyncio.sleep(3)
          print('Done......(%s)' % threading.currentThread())
          print('Hello again! (%s)' % threading.currentThread())

      loop = asyncio.get_event_loop()
      tasks = [hello(), hello()]
      loop.run_until_complete(asyncio.wait(tasks))
      loop.close()
Async & Await
    • More concise, with no decorator: async def replaces @asyncio.coroutine and await replaces yield from

      import threading
      import asyncio

      async def hello():
          print('Hello world! (%s)' % threading.currentThread())
          print('Starting......(%s)' % threading.currentThread())
          await asyncio.sleep(3)
          print('Done......(%s)' % threading.currentThread())
          print('Hello again! (%s)' % threading.currentThread())

      loop = asyncio.get_event_loop()
      tasks = [hello(), hello()]
      loop.run_until_complete(asyncio.wait(tasks))
      loop.close()
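
      On Python 3.7+ the boilerplate shrinks further: asyncio.run() creates and closes the loop for you, and asyncio.gather() runs coroutines concurrently. A sketch of the same example in that style (the name argument is illustrative):

      import asyncio

      async def hello(name):
          print('Hello world! (%s)' % name)
          await asyncio.sleep(3)
          print('Hello again! (%s)' % name)

      async def main():
          await asyncio.gather(hello('task 1'), hello('task 2'))

      asyncio.run(main())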
Aiohttp
    • Background:

      • HTTP is an I/O operation, so it is a natural fit for asyncio and coroutines; aiohttp is an HTTP client/server built on asyncio
    • Example:

      import asyncio
      from aiohttp import web

      async def index(request):
          await asyncio.sleep(0.5)
          return web.Response(body=b'<h1>Index</h1>')  # the response body is truncated in the source; a minimal payload is assumed

      Note: look this up in the aiohttp documentation and make sure you understand it; a sketch of a complete server follows
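
      A minimal sketch of wiring the handler above into a running server, using aiohttp's web.Application (the host and port values are illustrative):

      from aiohttp import web

      app = web.Application()
      app.router.add_get('/', index)  # route GET / to the index handler
      web.run_app(app, host='127.0.0.1', port=8000)  # serves until interrupted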

Concurrent.futures
  • Similar to a thread pool

  • Real parallel computing via multiprocessing: it runs multiple interpreters (see the ProcessPoolExecutor sketch after the thread-pool example below)

  • concurrent.futures.Executor

    • ThreadPoolExecutor
    • ProcessPoolExecutor
  • Example:

    from concurrent.futures import ThreadPoolExecutor
    import time

    def return_future(msg):
        time.sleep(3)
        return msg

    # create a thread pool
    pool = ThreadPoolExecutor(max_workers=2)

    # submit 2 tasks to the pool
    f1 = pool.submit(return_future, 'hello')
    f2 = pool.submit(return_future, 'world')

    print(f1.done())
    time.sleep(3)
    print(f2.done())
    print(f1.result())
    print(f2.result())
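
    For CPU-bound work, the same API runs with real parallelism under ProcessPoolExecutor. A small sketch (cpu_task is an illustrative workload):

    from concurrent.futures import ProcessPoolExecutor

    def cpu_task(n):
        # CPU-bound: sum of squares up to n
        return sum(i * i for i in range(n))

    if __name__ == '__main__':  # required where child processes are spawned
        with ProcessPoolExecutor(max_workers=2) as pool:
            print(list(pool.map(cpu_task, [10**6, 10**6])))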
  • map(fn, *iterables, timeout=None):

    map() and submit() are alternatives; using one of them is enough

    import time
    from concurrent import futures

    data = ['1', '2']

    def wait_on(argument):
        print(argument)
        time.sleep(2)
        return "ok"

    ex = futures.ThreadPoolExecutor(max_workers=2)
    for i in ex.map(wait_on, data):
        print(i)
  • Future

    • A Future instance is created by Executor.submit()
    from concurrent.futures import ThreadPoolExecutor as Pool
    from concurrent.futures import as_completed
    import requests

    URLS = ['http://qq.com', 'http://sina.com', 'http://www.baidu.com']

    def task(url, timeout=10):
        return requests.get(url, timeout=timeout)

    with Pool(max_workers=3) as executor:
        future_tasks = [executor.submit(task, url) for url in URLS]
        for f in future_tasks:
            if f.running():
                print('%s is running' % str(f))
        for f in as_completed(future_tasks):
            try:
                if f.done():
                    f_ret = f.result()
                    print('%s, done, result: %s, %s' % (str(f), f_ret.url, len(f_ret.content)))
            except Exception as e:
                f.cancel()
                print(str(e))
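
    Futures also support completion callbacks. A small sketch using add_done_callback() (work and on_done are illustrative names):

    from concurrent.futures import ThreadPoolExecutor

    def work(x):
        return x * 2

    def on_done(future):
        # invoked when the future finishes
        print('result:', future.result())

    with ThreadPoolExecutor(max_workers=1) as pool:
        f = pool.submit(work, 21)
        f.add_done_callback(on_done)  # -> result: 42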
