11. Python Concurrency Primer (Part 10: Communication Between Multiple Processes, and Data Sharing Between Processes)

Source: Internet
Author: User
Tags: semaphore

One, the process queue.

Multiple processes can operate on the data in one queue. It looks like a single process queue, and in fact it is just one queue; however many processes you start, once those processes use the queue, the queue object is copied that many times (one copy per process).

(Queue = pipe + Lock)

The main reason for this is that data cannot be shared directly between different processes.
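For instance, here is a minimal sketch (an assumed example, not from the original post) showing that an ordinary Python list passed to a child process is only a copy: the child's change is not visible in the parent.

#!/usr/local/bin/python2.7
# -*- coding: utf-8 -*-
# A minimal sketch (assumed example): ordinary objects are NOT shared
# between processes; the child process only modifies its own copy.

import multiprocessing


def worker(data):
    data.append("from child")            # Changes only the child's copy.
    print "child sees: %s" % (data)


if __name__ == '__main__':
    data = ["original"]
    p = multiprocessing.Process(target=worker, args=(data,))
    p.start()
    p.join()
    print "parent sees: %s" % (data)     # Still ["original"].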

The following is an example of using a process queue for communication between multiple processes. In the example, each child process puts content into the process queue, and the parent process reads it back out.

#!/usr/local/bin/python2.7
# -*- coding: utf-8 -*-

import multiprocessing


def func1(que, n):
    que.put(n + 1)
    print "son process queue id is %s" % (id(que))


if __name__ == '__main__':
    # Create a process queue. Note that this is not the same as the ordinary Queue.Queue!
    q1 = multiprocessing.Queue()
    print "main process queue id is %s" % (id(q1))
    for i in range(3):
        p = multiprocessing.Process(target=func1, args=(q1, i))
        p.start()
    print q1.get()
    print q1.get()
    print q1.get()


Output Result:

# Note: the result differs between Windows and Unix-like systems! Under Windows, the id of the process queue printed in the child processes is different from the one in the main process.

main process queue id is 4459325136

son process queue id is 4459325136

1

son process queue id is 4459325136

2

son process queue id is 4459325136

3


Two, inter-process communication through pipes.

The following is an example of using a pipe to let two processes communicate. The code has the parent process and the child process send messages to each other.

#!/usr/local/bin/python2.7
# -*- coding: utf-8 -*-

import multiprocessing


def func1(conn):
    conn.send("hi daddy!")           # The child process sends a message to the parent.
    response = conn.recv()           # The child process waits for the parent's message.
    print "response: %s" % (response)
    conn.close()
    print "child conn id %s" % (id(conn))


if __name__ == '__main__':
    # Use multiprocessing.Pipe() to create two connected connection objects.
    # Note that both ends are bidirectional: each one can send as well as receive.
    parent_conn, child_conn = multiprocessing.Pipe()
    print "main child conn: %s" % (id(child_conn))
    p = multiprocessing.Process(target=func1, args=(child_conn,))
    p.start()
    print parent_conn.recv()         # The parent waits for the child's message.
    parent_conn.send("hi child!")    # The parent sends a message to the child.
    p.join()

The two connection objects returned by Pipe() are like the two handsets of a telephone call: one is given to the parent process and the other to the child process.

The official documentation describes it as follows:

The two connection objects returned by Pipe() represent the two ends of the pipe. Each connection object has send() and recv() methods (among others). Note that data in a pipe may become corrupted if two processes (or threads) try to read from or write to the same end of the pipe at the same time. Of course there is no risk of corruption from processes using different ends of the pipe at the same time.

In other words, Pipe() creates two connected objects (the two ends of the pipe), and each end supports methods such as send() and recv().

But note: if two processes (or threads) read from or write to the same end of the pipe at the same time, the data may become corrupted! Using different ends at the same time carries no such risk.
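As an illustration, here is a minimal sketch (an assumed example, not from the original post) where each process holds a different end of the same pipe, so there is no contention on one end:

#!/usr/local/bin/python2.7
# -*- coding: utf-8 -*-
# A minimal sketch (assumed example): two child processes each hold a
# *different* end of one pipe, so there is no risk of same-end corruption.

import multiprocessing


def pinger(conn):
    conn.send("ping")
    print "pinger got: %s" % (conn.recv())
    conn.close()


def ponger(conn):
    print "ponger got: %s" % (conn.recv())
    conn.send("pong")
    conn.close()


if __name__ == '__main__':
    end_a, end_b = multiprocessing.Pipe()     # The two ends of one pipe.
    p1 = multiprocessing.Process(target=pinger, args=(end_a,))
    p2 = multiprocessing.Process(target=ponger, args=(end_b,))
    p1.start()
    p2.start()
    p1.join()
    p2.join()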


To summarize the relationship between the process queue and the pipe:

Pipes and process queues both implement communication between processes, whether between two child processes or between a parent and a child. They do the same job, only with different mechanisms, and what they provide is communication between processes. Note: this is only data communication, not data sharing! Don't confuse the two concepts.

So what is real data sharing?

Real data sharing means that when one process modifies a piece of data, the change also takes effect inside another process.

To make this concrete: if a shared list contains three elements and process 1 deletes one of them, then when process 2 prints the list it sees only two elements left.


So how do you implement data sharing between processes? This is done with the Manager.
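As a preview, here is a minimal sketch (an assumed example, not from the original post) of exactly the list scenario described above, using the Manager introduced in the next section:

#!/usr/local/bin/python2.7
# -*- coding: utf-8 -*-
# A minimal sketch (assumed example): a Manager-backed list is really shared,
# so a deletion made by one process is seen by another process.

import multiprocessing


def remove_one(shared_list):
    shared_list.pop()                     # Process 1 deletes an element.


def show(shared_list):
    # Process 2 prints the list and sees only two elements left.
    print "process 2 sees: %s" % (list(shared_list))


if __name__ == '__main__':
    manager = multiprocessing.Manager()
    l1 = manager.list([1, 2, 3])          # A shared list with three elements.
    p1 = multiprocessing.Process(target=remove_one, args=(l1,))
    p1.start()
    p1.join()                             # Make sure the deletion happens first.
    p2 = multiprocessing.Process(target=show, args=(l1,))
    p2.start()
    p2.join()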


Three, using Manager to share data between processes.

Queues and pipes only implement the exchange of data; they do not share it. Sharing means that one process changes data and another process sees the change.

The official introduction to Manager:

A manager object returned by Manager() controls a server process which holds Python objects and allows other processes to manipulate them using proxies.

In other words, Manager() returns a manager object that controls a server process; this process holds Python objects and lets other processes manipulate them through proxies.


Which data types are supported by the Manager object?

A manager returned by Manager() will support types list, dict, Namespace, Lock, RLock, Semaphore, BoundedSemaphore, Condition, Event, Barrier, Queue, Value and Array.


That is: list, dictionary, namespace, lock, recursive lock, semaphore, bounded semaphore, condition variable, event, barrier, queue, value, and array. (The author has not yet worked with some of these types.)
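The example below uses dict and list; the other types are used the same way. Here is a minimal sketch (an assumed example, not from the original post) sharing a counter through manager.Value() together with a manager.Lock():

#!/usr/local/bin/python2.7
# -*- coding: utf-8 -*-
# A minimal sketch (assumed example): a shared counter via manager.Value(),
# protected by a manager.Lock() so the increments do not race.

import multiprocessing


def add_one(counter, lock):
    with lock:                            # Serialize access to the shared value.
        counter.value += 1


if __name__ == '__main__':
    manager = multiprocessing.Manager()
    counter = manager.Value('i', 0)       # A shared integer, initial value 0.
    lock = manager.Lock()
    workers = [multiprocessing.Process(target=add_one, args=(counter, lock))
               for _ in range(5)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print "counter value: %d" % (counter.value)   # Expected: 5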


The following is an example of how data sharing is implemented between processes:

#!/usr/local/bin/python2.7
# -*- coding: utf-8 -*-

import multiprocessing


def func1(dic_1, list_1, name):
    dic_1[name] = 'test'
    # dic_1['num'] = 123
    list_1.append(name)
    print "son process dict id: %s list id: %s" % (id(dic_1), id(list_1))


if __name__ == '__main__':
    manager = multiprocessing.Manager()
    # Note: for data types such as dictionaries and lists to be shared between
    # processes, you must use the versions wrapped by the manager!
    d1 = manager.dict()
    # The same goes for the list: this is the special list provided by the manager.
    l1 = manager.list()
    print "main process dict id: %s list id: %s" % (id(d1), id(l1))
    pro_list = []
    for i in range(10):
        p = multiprocessing.Process(target=func1, args=(d1, l1, str(i)))
        p.start()
        pro_list.append(p)
    for res in pro_list:
        res.join()
    print d1
    print l1




This article is from the "Rebirth" blog; please keep this source: http://suhaozhi.blog.51cto.com/7272298/1925925
