Python (9): Multiprocessing and Concurrency

Source: Internet
Author: User

Multiprocessing Module

To make full use of the resources of a multi-core CPU (see os.cpu_count()), Python programs in most cases need multiple processes. For this, Python provides the multiprocessing module.
The multiprocessing module is used to spawn child processes and run our custom tasks (such as a function) in them; its programming interface is similar to that of the multithreading module, threading.

The multiprocessing module has rich functionality: spawning child processes, inter-process communication and data sharing, various forms of synchronization, and components such as Process, Queue, Pipe, and Lock.

One thing worth emphasizing again: unlike threads, processes share no state. When a process modifies data, the change is confined to that process.

Process class

To create a process object:

Process(group=None, target=None, name=None, args=(), kwargs={}) — instantiating this class produces an object that represents a task to be run in a child process (not yet started).

Note: 1. Arguments should be passed by keyword. 2. args specifies the positional arguments to pass to the target function; it must be a tuple, so a single argument needs a trailing comma, e.g. args=('Egon',).

Parameters:
group: not used; its value is always None.
target: the callable object, i.e. the task the child process will execute.
args: the positional arguments for the target, as a tuple, e.g. args=('Egon',).
kwargs: the keyword arguments for the target, as a dict, e.g. kwargs={'name': 'Egon', 'age': 18}.
name: the name of the child process.

Methods:
p.start(): starts the process and arranges for p.run() to be called in the child process.
p.run(): the method that runs when the process starts; it is exactly what calls the function specified by target. A custom Process subclass must implement this method.
p.terminate(): forcibly terminates process p without any cleanup. If p has created child processes of its own, they become zombie processes, so use this method with special care. If p also holds a lock, the lock is never released, resulting in deadlock.
p.is_alive(): returns True if p is still running.
p.join([timeout]): the main process waits for p to terminate (emphasis: the main process is in a waiting state while p keeps running). timeout is an optional time limit. Note that join() can only be used on processes opened with start(), not on those run directly with run().

Properties:
p.daemon: defaults to False. If set to True, p runs as a background daemon: when p's parent process terminates, p terminates with it, and p cannot create child processes of its own. Must be set before p.start().
p.name: the process name.
p.pid: the process PID.
p.exitcode: None while the process is running; a value of -N means the process was terminated by signal N (for understanding).
p.authkey: the process's authentication key, by default a 32-character string generated by os.urandom(). Its purpose is to secure the underlying inter-process communication that involves a network connection, which can succeed only when both ends hold the same authentication key (for understanding).

Use of the Process class

On Windows, the code that starts processes must be placed under if __name__ == '__main__':.

Two ways to create and start a child process

# Method 1: pass the task to Process via target
import time
import random
from multiprocessing import Process

def piao(name):
    print('%s piaoing' % name)
    time.sleep(random.randrange(1, 5))
    print('%s piao end' % name)

if __name__ == '__main__':
    p1 = Process(target=piao, args=('Egon',))  # the trailing comma is required: args must be a tuple
    p2 = Process(target=piao, args=('Alex',))
    p3 = Process(target=piao, args=('Wupeiqi',))
    p4 = Process(target=piao, args=('Yuanhao',))
    p1.start()
    p2.start()
    p3.start()
    p4.start()
    print('main process')
# Method 2: subclass Process and override run()
import time
import random
from multiprocessing import Process

class Piao(Process):
    def __init__(self, name):
        super().__init__()
        self.name = name
    def run(self):
        print('%s piaoing' % self.name)
        time.sleep(random.randrange(1, 5))
        print('%s piao end' % self.name)

if __name__ == '__main__':
    p1 = Piao('Egon')
    p2 = Piao('Alex')
    p3 = Piao('Wupeiqi')
    p4 = Piao('Yuanhao')
    p1.start()  # start() automatically calls run() in the child process
    p2.start()
    p3.start()
    p4.start()
    print('main process')

The memory spaces of processes are isolated from each other
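This isolation can be demonstrated directly: a child process that rebinds a global variable changes only its own copy, and the parent's value is untouched. A minimal sketch:

```python
from multiprocessing import Process

n = 100  # global variable in the parent process

def task():
    global n
    n = 0                        # modifies only the child's own copy
    print('child n =', n)

if __name__ == '__main__':
    p = Process(target=task)
    p.start()
    p.join()
    print('parent n =', n)       # still 100: the change stayed inside the child
```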

The join method of a Process object

from multiprocessing import Process
import time
import random

class Piao(Process):
    def __init__(self, name):
        super().__init__()
        self.name = name
    def run(self):
        print('%s is piaoing' % self.name)
        time.sleep(random.randrange(1, 3))
        print('%s is piao end' % self.name)

if __name__ == '__main__':
    p = Piao('Egon')
    p.start()
    p.join(0.0001)  # wait for p to stop, but give up after 0.0001 seconds
    print('start')


Daemon process

The main process creates a daemon process

First: the daemon process terminates as soon as the main process's code finishes executing.

Second: a daemon process cannot open child processes of its own; attempting to do so raises an exception: AssertionError: daemonic processes are not allowed to have children.

Note: processes are independent of each other; the daemon terminates once the main process's code has finished running.

Process Synchronization (Lock)

Data is not shared between processes, but they do share the same file system, so accessing the same file, or the same print terminal, is possible.

However, such shared access leads to competition, and the result of competition is disorder. The way to control it is to lock.

Part 1: multiple processes sharing the same print terminal

Part 2: multiple processes sharing the same file

A lock ensures that when multiple processes modify the same piece of data, only one task can modify it at a time; that is, modification becomes serial. Yes, this is slow, but it sacrifices speed to guarantee data safety.

Although shared files can be used for inter-process communication, this approach has problems: 1. It is inefficient (the shared data is file-based, and files are data on the hard disk). 2. You need to handle the locking yourself.

So it is best to find a solution that takes care of both: 1. high efficiency (multiple processes sharing a piece of memory), and 2. handling the locking for us. This is the message-based IPC mechanism the multiprocessing module provides: queues and pipes. 1. Queues and pipes store data in memory. 2. A queue is implemented on top of (pipe + lock), which frees us from the complicated lock handling.

We should try to avoid shared data and use message passing and queues as much as possible, avoiding complex synchronization and locking problems; as the number of processes grows, this often scales better as well.

