The multiprocessing module in Python


multiprocessing.Pipe([duplex])
Returns a pair of Connection objects (conn1, conn2) representing the two ends of the pipe; by default the pipe is two-way. If duplex=False, conn1 can only be used to receive messages and conn2 can only be used to send messages. This differs from os.pipe(), which returns two file descriptors (r, w), one readable and one writable.

Examples are as follows:

#!/usr/bin/python
# coding=utf-8
from multiprocessing import Process, Pipe

def send(pipe):
    pipe.send(['spam'] + [42, 'egg'])
    pipe.close()

def talk(pipe):
    pipe.send(dict(name='Bob', spam=42))
    reply = pipe.recv()
    print('Talker got:', reply)

if __name__ == '__main__':
    (con1, con2) = Pipe()
    sender = Process(target=send, name='send', args=(con1,))
    sender.start()
    # Receive the message sent by the send process
    print('Con2 got: %s' % con2.recv())
    con2.close()

    (parentEnd, childEnd) = Pipe()
    child = Process(target=talk, name='talk', args=(childEnd,))
    child.start()
    print('Parent got:', parentEnd.recv())
    parentEnd.send({x * 2 for x in 'spam'})
    child.join()
    print('Parent exit')

The output is as follows:

Con2 got: ['spam', 42, 'egg']
Parent got: {'name': 'Bob', 'spam': 42}
Talker got: {'ss', 'aa', 'pp', 'mm'}
Parent exit
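The example above uses the default two-way pipe. As a minimal sketch of the one-way case (the function and message here are illustrative, not from the original example), with duplex=False the first connection can only receive and the second can only send:

```python
from multiprocessing import Process, Pipe

# Hypothetical helper: the child writes one message into the send end
def child(conn):
    conn.send('hello from child')
    conn.close()

if __name__ == '__main__':
    # With duplex=False, conn1 can only recv() and conn2 can only send()
    conn1, conn2 = Pipe(duplex=False)
    p = Process(target=child, args=(conn2,))
    p.start()
    print('Parent received:', conn1.recv())
    p.join()
```

Calling conn1.send() here would raise an OSError, since that end of the pipe is read-only.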

  

The child-process concept in multiprocessing

from multiprocessing import Process

The Process class can be used to construct a child process:

p = Process(target=fun, args=(args,))

The child process is then started with p.start().

The p.join() method makes the parent process wait until the child process has finished running before it continues.

from multiprocessing import Process
import os

# Code executed by the child process
def run_proc(name):
    print('Run child process %s (%s)...' % (name, os.getpid()))

if __name__ == '__main__':
    print('Parent process %s.' % os.getpid())
    p = Process(target=run_proc, args=('test',))
    print('Process will start.')
    p.start()
    p.join()
    print('Process end.')

  

Using Pool in multiprocessing

If you need many child processes, consider using a process pool to manage them:

from multiprocessing import Pool

from multiprocessing import Pool
import os, time

def long_time_task(name):
    print('Run task %s (%s)...' % (name, os.getpid()))
    start = time.time()
    time.sleep(3)
    end = time.time()
    print('Task %s runs %0.2f seconds.' % (name, end - start))

if __name__ == '__main__':
    print('Parent process %s.' % os.getpid())
    p = Pool()
    for i in range(5):
        p.apply_async(long_time_task, args=(i,))
    print('Waiting for all subprocesses done...')
    p.close()
    p.join()
    print('All subprocesses done.')

  

Pool creates child processes differently from Process: tasks are submitted with

p.apply_async(func, args=args)

The number of tasks a pool runs at the same time depends on the number of CPUs in your computer. For example, on a computer with 4 CPUs, task0, task1, task2, and task3 can start at the same time, while task4 does not start until one of the earlier tasks has finished.

p.close() closes the process pool so that no more processes can be added to it. Calling join() on the Pool object waits for all child processes to finish; close() must be called before join(), and after close() no new Process can be added.
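A minimal sketch of that ordering, using a hypothetical square task (the names here are illustrative, not from the article). Submitting a task after close() raises a ValueError:

```python
from multiprocessing import Pool

# Hypothetical task used only for illustration
def square(x):
    return x * x

if __name__ == '__main__':
    p = Pool(2)
    results = [p.apply_async(square, args=(i,)) for i in range(4)]
    p.close()   # no new tasks can be submitted from here on
    p.join()    # join() must come after close() (or terminate())
    print([r.get() for r in results])  # [0, 1, 4, 9]
    try:
        p.apply_async(square, args=(5,))
    except ValueError:
        print('pool is closed; cannot add new tasks')
```

apply_async returns an AsyncResult, so the return values of the tasks can be collected with get() after the pool has been joined.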

The number of worker processes can also be specified when the pool is instantiated. If the code above used p = Pool(5), then all five child processes could start at the same time.

Communication between multiple sub-processes

Communication between child processes uses the Queue mentioned earlier. For example, suppose one child process writes data to a queue while another process fetches the data from it:

from multiprocessing import Process, Queue
import os, time, random

# Code executed by the writer process:
def write(q):
    for value in ['A', 'B', 'C']:
        print('Put %s to queue...' % value)
        q.put(value)
        time.sleep(random.random())

# Code executed by the reader process:
def read(q):
    while True:
        if not q.empty():
            value = q.get(True)
            print('Get %s from queue.' % value)
            time.sleep(random.random())
        else:
            break

if __name__ == '__main__':
    # The parent process creates a Queue and passes it to each child process:
    q = Queue()
    pw = Process(target=write, args=(q,))
    pr = Process(target=read, args=(q,))
    # Start the writer process:
    pw.start()
    # Wait for pw to finish so the queue is fully populated:
    pw.join()
    # Start the reader process and wait for it:
    pr.start()
    pr.join()
    print('All data has been written and read')

  

An interesting issue arises with the code above if the main block is rewritten to use a process pool:

if __name__ == '__main__':
    # The parent process creates a Queue and passes it to each child process:
    q = Queue()
    p = Pool()
    pw = p.apply_async(write, args=(q,))
    pr = p.apply_async(read, args=(q,))
    p.close()
    p.join()
    print('All data has been written and read')

  

Writing the main block this way, the intent is to create a queue and pass it as a parameter to each child process in the pool, but it raises:

RuntimeError: Queue objects should only be shared between processes through inheritance

The gist of the error is that an ordinary Queue object cannot be passed to pool workers. To use a queue with a process pool, it must be created through the multiprocessing Manager class:

if __name__ == '__main__':
    manager = multiprocessing.Manager()
    # The parent process creates a queue and passes it to each child process:
    q = manager.Queue()
    p = Pool()
    pw = p.apply_async(write, args=(q,))
    time.sleep(0.5)  # give the writer a head start before the reader checks the queue
    pr = p.apply_async(read, args=(q,))
    p.close()
    p.join()
    print('All data has been written and read')

  

This allows the queue object to be shared between the parent process and the pool's child processes. The Manager class in multiprocessing will be expanded on later.
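As a small preview (a sketch with hypothetical names, not part of the original article), a Manager can share other objects besides queues with pool workers, such as a dict proxy:

```python
import multiprocessing
from multiprocessing import Pool

# Hypothetical worker: stores the square of its input in a shared dict
def record(d, value):
    d[value] = value * value

if __name__ == '__main__':
    manager = multiprocessing.Manager()
    d = manager.dict()   # a dict proxy that pool workers can mutate
    p = Pool()
    for i in range(4):
        p.apply_async(record, args=(d, i))
    p.close()
    p.join()
    print(dict(d))  # {0: 0, 1: 1, 2: 4, 3: 9} (key order may vary)
```

The mutations made by each worker go through the manager process, so they are visible to the parent after join().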

Regarding locks: if different processes operate on the same queue at the same time, then to avoid errors a lock can be acquired in the function that operates on the queue, so that only one child process can operate on the queue at a time. With a process pool, the lock must likewise be created through the Manager object:

from multiprocessing import Pool
import multiprocessing
import os, time, random

# Code executed by the writer process:
def write(q, lock):
    lock.acquire()  # acquire the lock
    for value in ['A', 'B', 'C']:
        print('Put %s to queue...' % value)
        q.put(value)
    lock.release()  # release the lock

# Code executed by the reader process:
def read(q):
    while True:
        if not q.empty():
            value = q.get(False)
            print('Get %s from queue.' % value)
            time.sleep(random.random())
        else:
            break

if __name__ == '__main__':
    manager = multiprocessing.Manager()
    # The parent process creates a queue and passes it to each child process:
    q = manager.Queue()
    lock = manager.Lock()  # create a lock
    p = Pool()
    pw = p.apply_async(write, args=(q, lock))
    pr = p.apply_async(read, args=(q,))
    p.close()
    p.join()
    print('All data has been written and read')

  
