A tutorial on deploying the Python Celery framework on Red Hat Linux

Source: Internet
Author: User
Tags: documentation, message queue, redis, serialization, sqlite, unique id, rabbitmq, python

Celery is a distributed task queue written in Python. It uses task queues to schedule work across distributed machines, processes, and threads.
Architecture Design

The Celery architecture consists of three parts: the message broker, the task execution unit (worker), and the task result store.

1. Message Broker

Celery itself does not provide a messaging service, but it integrates easily with third-party message brokers, including RabbitMQ, Redis, MongoDB (experimental), Amazon SQS (experimental), CouchDB (experimental), SQLAlchemy (experimental), Django ORM (experimental), and IronMQ.

2. Task Execution Unit

A worker is the task execution unit provided by Celery; workers run concurrently on the nodes of a distributed system.

3. Task Result Store

The task result store holds the results of tasks executed by workers. Celery supports storing task results in several backends, including AMQP, Redis, Memcached, MongoDB, SQLAlchemy, Django ORM, Apache Cassandra, and IronCache.
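The three parts above can be sketched with the standard library alone. The following toy model (a plain queue as the broker, a thread as the worker, a dict as the result store) is purely illustrative and does not reflect Celery's actual implementation:

```python
import threading
import queue

# Toy model of Celery's architecture: broker, worker, result store.
broker = queue.Queue()   # message broker: holds pending task messages
results = {}             # result store: task id -> result

def worker():
    # Task execution unit: pull tasks off the broker and run them.
    while True:
        task_id, func, args = broker.get()
        results[task_id] = func(*args)
        broker.task_done()

threading.Thread(target=worker, daemon=True).start()

# A client enqueues a task through the broker...
broker.put(("task-1", lambda x, y: x + y, (4, 4)))
broker.join()

# ...and later reads the result from the result store.
print(results["task-1"])  # 8
```

In real Celery the broker is an external service (e.g. RabbitMQ), the workers are separate processes on possibly many machines, and the result store is one of the backends listed above.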

In addition, Celery supports several concurrency models and serialization formats.

1. Concurrency

prefork, eventlet, gevent, threads/single-threaded

2. Serialization

pickle, JSON, YAML, msgpack; zlib and bzip2 compression; cryptographic message signing; etc.
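To see what serialization means in this context, here is a small stdlib sketch of serializing a task call into a broker message, comparing JSON and pickle. The message fields are illustrative, not Celery's actual wire format:

```python
import json
import pickle

# Hypothetical task message, as a broker might carry it.
message = {"task": "tasks.add", "args": [4, 4], "kwargs": {}}

# JSON: human-readable and language-neutral.
json_body = json.dumps(message)

# Pickle: Python-only and able to carry arbitrary objects, but
# unsafe to deserialize from untrusted sources.
pickle_body = pickle.dumps(message)

# Both round-trip the message intact.
print(json.loads(json_body) == message)      # True
print(pickle.loads(pickle_body) == message)  # True
```

The trade-off is the usual one: pickle handles more Python types, while JSON is safer and interoperable with non-Python consumers.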

Install and run

The installation process for Celery is somewhat involved. The steps below are from my installation on an AWS EC2 Linux instance; the process may differ on other systems. You can refer to the official documentation.

I chose RabbitMQ as the message broker, so RabbitMQ must be installed first. As a preliminary step, update yum:

sudo yum -y update

RabbitMQ is built on Erlang, so install Erlang first:

# Add and enable the relevant application repositories:
# Note: we are also enabling the third-party Remi package.
wget http://dl.fedoraproject.org/pub/epel/6/x86_64/epel-release-6-8.noarch.rpm
wget http://rpms.famillecollet.com/enterprise/remi-release-6.rpm
sudo rpm -Uvh remi-release-6*.rpm epel-release-6*.rpm

# Finally, download and install Erlang:
sudo yum install -y erlang

Then install RabbitMQ:

# Download the latest RabbitMQ package using wget:
wget 
# Add the necessary keys for verification:
rpm --import 
# Install the .rpm package using yum:
yum install rabbitmq-server-3.2.2-1.noarch.rpm

Start the RabbitMQ service:

sudo service rabbitmq-server start

With the RabbitMQ service ready, install Celery. Assuming you use pip to manage your Python packages:

pip install celery


To test whether Celery is working, we run the simplest possible task. Write tasks.py:

from celery import Celery

app = Celery('tasks', backend='amqp', broker='amqp://guest@localhost//')
app.conf.CELERY_RESULT_BACKEND = 'db+sqlite:///results.sqlite'

@app.task
def add(x, y):
    return x + y

Run a worker in the current directory to perform this addition task

celery -A tasks worker --loglevel=info

The -A option gives the name of the Celery app. Note that I am using SQLAlchemy as the result store here, so the corresponding Python package must be installed beforehand.

We see this information in the worker log

-- ******* ---- [config]
- ** ---------- .> app:         tasks:0x1e68d50
- ** ---------- .> transport:   amqp://guest:**@localhost:5672//
- ** ---------- .> results:     db+sqlite:///results.sqlite
- *** --- * --- .> concurrency: 8 (prefork)

Here we can see that the worker uses the prefork pool for concurrency by default, with the concurrency level set to 8.
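A fixed-size pool of 8 workers executing tasks concurrently can be sketched with the standard library. Celery's prefork pool uses OS processes; for a compact, self-contained illustration this sketch uses a thread pool instead, which is an analogy rather than Celery's actual mechanism:

```python
from concurrent.futures import ThreadPoolExecutor

def add(x, y):
    return x + y

# A pool of 8 workers, mirroring the "concurrency: 8" setting above.
# Celery's prefork pool forks 8 processes; here we use 8 threads.
with ThreadPoolExecutor(max_workers=8) as pool:
    futures = [pool.submit(add, i, i) for i in range(8)]
    outputs = [f.result() for f in futures]

print(outputs)  # [0, 2, 4, 6, 8, 10, 12, 14]
```

With Celery the pool size can be overridden on the command line, e.g. `celery -A tasks worker --concurrency=4`.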

The following client code executes the task:

from tasks import add
import time

result = add.delay(4, 4)
while not result.ready():
    print "not ready yet"
    time.sleep(5)

print result.get()

Running this client code in Python produces the following output on the client side:

not ready yet
8

The worker log shows:

[2015-03-12 02:54:07,973: INFO/MainProcess] Received task: tasks.add[34c4210f-1bc5-420f-a421-1500361b914f]
[2015-03-12 02:54:08,006: INFO/MainProcess] Task tasks.add[34c4210f-1bc5-420f-a421-1500361b914f] succeeded in 0.0309705100954s: 8

Here we can see that each task has a unique id, and that tasks execute asynchronously on the worker.

Note that if you run the example from the official documentation as-is, you cannot retrieve the result on the client; this is why I use SQLAlchemy to store the task results here. The official example uses AMQP as the result backend, and with AMQP the result is delivered as a message on a queue. It may be that once the result message is consumed (for instance when the worker prints the task's result in its log), it is removed from the queue, so the client can never fetch the result. I do not know why the official documentation overlooks this pitfall.
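The one-shot behavior described above can be illustrated with a plain queue: a message can be consumed exactly once, and a second reader finds the queue empty. This is a generic queue sketch modeling the described behavior, not Celery or AMQP code:

```python
import queue

# With a queue-based result backend, the result is just a message.
result_queue = queue.Queue()
result_queue.put(8)  # the worker publishes the task's result

first_read = result_queue.get()           # first consumer receives 8
second_read_empty = result_queue.empty()  # nothing left for anyone else

print(first_read)         # 8
print(second_read_empty)  # True
```

A database-backed result store like SQLAlchemy does not have this problem, because reading a row does not delete it.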

For a deeper understanding of Celery, please refer to the official documentation.
