Tutorial on Implementing Distributed Processes in Python Programs


Between Thread and Process, Process should be preferred because it is more stable and can be distributed across multiple machines, while Thread can at most be distributed across the multiple CPUs of a single machine.

Python's multiprocessing module not only supports multiple processes, but also allows those processes to be distributed across multiple machines: a service process acts as a scheduler, distributing tasks to several other processes and relying on network communication. Because the managers submodule is well encapsulated, you can write distributed multi-process programs without knowing the details of the network communication.

For example, suppose we already have a multi-process program that communicates through a Queue and runs on a single machine. Now, because the task-processing workload is heavy, we want to move the process that sends tasks and the process that handles tasks onto two different machines. How can we do this with distributed processes?

The original Queue can continue to be used. By exposing the Queue over the network through the managers module, processes on other machines can access it.

Let's first look at the service process. The service process is responsible for starting the Queue, registering it on the network, and then writing tasks into it:

# taskmanager.py
import random, queue
from multiprocessing.managers import BaseManager

# Queue for sending tasks:
task_queue = queue.Queue()
# Queue for receiving results:
result_queue = queue.Queue()

# QueueManager inherited from BaseManager:
class QueueManager(BaseManager):
    pass

# Register both queues on the network; the callable parameter associates
# each name with a Queue object.
# (Registering lambdas works with the default 'fork' start method on Linux;
# on Windows/macOS replace them with module-level functions.)
QueueManager.register('get_task_queue', callable=lambda: task_queue)
QueueManager.register('get_result_queue', callable=lambda: result_queue)
# Bind port 5000 and set the authkey b'abc' (it must be bytes in Python 3):
manager = QueueManager(address=('', 5000), authkey=b'abc')
# Start the manager:
manager.start()
# Obtain the Queue objects accessed through the network:
task = manager.get_task_queue()
result = manager.get_result_queue()
# Put several tasks in:
for i in range(10):
    n = random.randint(0, 10000)
    print('Put task %d...' % n)
    task.put(n)
# Read results from the result queue:
print('Try get results...')
for i in range(10):
    r = result.get(timeout=10)
    print('Result: %s' % r)
# Close:
manager.shutdown()

Note that when we write a multi-process program on a single machine, a created Queue can be used directly. In a distributed multi-process environment, however, tasks cannot be added by operating on the original task_queue directly: that would bypass the QueueManager encapsulation. Tasks must be added through the Queue interface obtained via manager.get_task_queue().
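To make the distinction concrete, here is a minimal, self-contained sketch (the helper name get_task_queue and the use of port 0 are illustrative, not part of the tutorial's code): two proxies obtained from the same manager refer to the same server-side queue, while the original queue object in the client process is a separate copy that the server never sees.

```python
import queue
from multiprocessing.managers import BaseManager

task_queue = queue.Queue()      # the "original" queue, as in taskmanager.py

def get_task_queue():           # module-level so it pickles under any start method
    return task_queue

class QueueManager(BaseManager):
    pass

QueueManager.register('get_task_queue', callable=get_task_queue)

def main():
    # Port 0 lets the OS pick a free port; authkey must be bytes in Python 3.
    manager = QueueManager(address=('127.0.0.1', 0), authkey=b'abc')
    manager.start()
    try:
        # Correct: go through the manager, so put() reaches the server process.
        manager.get_task_queue().put(42)
        # A second proxy sees the same server-side queue.
        item = manager.get_task_queue().get()
        # Wrong: task_queue.put(...) here would only fill this process's own
        # copy -- manager.start() runs the server in a child process with its
        # own task_queue, so the proxies would never see that item.
        return item
    finally:
        manager.shutdown()

if __name__ == '__main__':
    print(main())
```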

Then start the worker process on another machine (or on the same machine):

# taskworker.py
import time, sys, queue
from multiprocessing.managers import BaseManager

# Create a similar QueueManager:
class QueueManager(BaseManager):
    pass

# Because this QueueManager only obtains the queues from the network,
# only the names are provided when registering:
QueueManager.register('get_task_queue')
QueueManager.register('get_result_queue')

# Connect to the server, i.e. the machine running taskmanager.py:
server_addr = sys.argv[1] if len(sys.argv) > 1 else '127.0.0.1'
print('Connect to server %s...' % server_addr)
# The port and authkey must match the taskmanager.py settings exactly:
m = QueueManager(address=(server_addr, 5000), authkey=b'abc')
# Connect over the network:
m.connect()
# Obtain the Queue objects:
task = m.get_task_queue()
result = m.get_result_queue()
# Get tasks from the task queue and write results to the result queue:
for i in range(10):
    try:
        n = task.get(timeout=1)
        print('run task %d * %d...' % (n, n))
        r = '%d * %d = %d' % (n, n, n * n)
        time.sleep(1)
        result.put(r)
    except queue.Empty:
        print('task queue is empty.')
# Processing ends:
print('worker exit.')

To connect a worker process to the service process over the network, the worker must be given the IP address of the machine where the service process runs.

Now you can try out the effect of distributed processes. First start the taskmanager.py service process:

$ python taskmanager.py
Put task 3411...
Put task 1605...
Put task 1398...
Put task 4729...
Put task 5300...
Put task 7471...
Put task 68...
Put task 4219...
Put task 339...
Put task 7866...
Try get results...

After the taskmanager process has sent out the tasks, it starts waiting for results on the result queue. Now start taskworker.py:

$ python taskworker.py 127.0.0.1
Connect to server 127.0.0.1...
run task 3411 * 3411...
run task 1605 * 1605...
run task 1398 * 1398...
run task 4729 * 4729...
run task 5300 * 5300...
run task 7471 * 7471...
run task 68 * 68...
run task 4219 * 4219...
run task 339 * 339...
run task 7866 * 7866...
worker exit.

When the taskworker process finishes, the results are printed in the taskmanager process:

Result: 3411 * 3411 = 11634921
Result: 1605 * 1605 = 2576025
Result: 1398 * 1398 = 1954404
Result: 4729 * 4729 = 22363441
Result: 5300 * 5300 = 28090000
Result: 7471 * 7471 = 55815841
Result: 68 * 68 = 4624
Result: 4219 * 4219 = 17799961
Result: 339 * 339 = 114921
Result: 7866 * 7866 = 61873956

What is the use of this simple Manager/Worker model? It is in fact a simple but genuine form of distributed computing. By slightly modifying the code and starting several workers, you can distribute tasks to a handful or even dozens of machines. For example, if you change the code that computes n * n into code that sends mail, you have implemented asynchronous sending of a mail queue.

Where is the Queue object stored? Notice that taskworker.py creates no Queue at all, so the Queue objects are stored in the taskmanager.py process:

The Queue is made accessible over the network through the QueueManager. Since a QueueManager manages more than one Queue, each Queue's network interface must be given a name, such as get_task_queue.

What is authkey for? It ensures that the two machines communicate properly and that other machines cannot maliciously interfere. If the authkey of taskworker.py does not match that of taskmanager.py, the connection will certainly fail.
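That failure can be observed directly. In this self-contained sketch (the registered name get_items, the use of port 0, and the outcome strings are all illustrative), a client with the wrong authkey is rejected with an AuthenticationError during the connection handshake, while a client with the matching key connects normally:

```python
from multiprocessing.managers import BaseManager

class QueueManager(BaseManager):
    pass

QueueManager.register('get_items', callable=list)

def main():
    server = QueueManager(address=('127.0.0.1', 0), authkey=b'abc')
    server.start()
    try:
        outcomes = []
        # Wrong authkey: the handshake's challenge/response check fails.
        bad = QueueManager(address=server.address, authkey=b'wrong')
        try:
            bad.connect()
            outcomes.append('bad key accepted')
        except Exception as exc:   # multiprocessing.AuthenticationError
            outcomes.append('bad key rejected: %s' % type(exc).__name__)
        # Matching authkey: connects normally.
        good = QueueManager(address=server.address, authkey=b'abc')
        good.connect()
        outcomes.append('good key connected')
        return outcomes
    finally:
        server.shutdown()

if __name__ == '__main__':
    for line in main():
        print(line)
```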
Summary

Python's distributed-process interface is simple and well encapsulated, making it suitable for distributing heavy tasks across multiple machines.

Note that the role of the Queue is to pass tasks and receive results; the description of each task should be as small as possible. For example, to send a task that processes a log file, do not send the log file itself, which may be hundreds of megabytes; instead, send the full path where the log file is stored, and let the worker process read the file from a shared disk.
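A small sketch of that advice (the file contents and the count_errors helper are made up for illustration): only the path travels through the queue, and the worker side opens the file itself.

```python
import os
import queue
import tempfile

def count_errors(path):
    # Worker-side: open the (possibly huge) log file from shared storage
    # and do the actual processing -- here, count lines containing ERROR.
    with open(path) as f:
        return sum(1 for line in f if 'ERROR' in line)

def main():
    # Stand-in for a big log file on a shared disk.
    fd, log_path = tempfile.mkstemp(suffix='.log')
    with os.fdopen(fd, 'w') as f:
        f.write('INFO start\nERROR disk full\nINFO retry\nERROR timeout\n')

    task_queue = queue.Queue()   # in the tutorial, the network-exposed Queue
    task_queue.put(log_path)     # send the path (a few bytes), not the file

    path = task_queue.get()      # worker receives only the path
    errors = count_errors(path)
    os.remove(log_path)
    return errors

if __name__ == '__main__':
    print(main())
```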
