Installing and using the asynchronous task queue package Celery in a Python environment

1. Introduction

Celery is an asynchronous task queue/job queue based on distributed message passing. It is focused on real-time operation, but supports scheduling as well.
Celery is used in production systems to handle millions of tasks per day.
Celery is written in Python, but the protocol can be implemented in any language. It can also interoperate with other languages through webhooks.
The recommended message broker is RabbitMQ, but there is also limited support for Redis, Beanstalk, MongoDB, CouchDB, and databases (via SQLAlchemy or the Django ORM).
Celery is easy to integrate with Django, Pylons, and Flask using the django-celery, celery-pylons, and flask-celery add-ons.

2. Installation
Given the above, you need to install a few things: RabbitMQ, SQLAlchemy, and Celery.
Installing them is simple. For RabbitMQ:
On Mac:

brew install rabbitmq

On Linux:

sudo apt-get install rabbitmq-server

The remaining two are Python packages and can be installed directly with pip; if you have never installed a MySQL driver, you may also need to install MySQL-python.
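For example (assuming pip is available; MySQL-python is the legacy MySQL driver and is only needed for the MySQL result backend used later):

$ pip install celery
$ pip install sqlalchemy
$ pip install MySQL-python
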
After the installation is complete, start the service:

$ rabbitmq-server

Do not close this window after it starts; open a new window (or tab) for the steps that follow.
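
To check that the broker actually came up, rabbitmqctl can report its status (depending on your install, sudo may be required):

$ sudo rabbitmqctl status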

3. A simple example
Make sure the RabbitMQ server you started earlier is still running.
Following the example on the official website, create a new tasks.py file in any directory with the following content:

from celery import Celery

app = Celery('tasks', broker='amqp://guest@localhost//')

@app.task
def add(x, y):
    return x + y

Then, in the same directory, execute:

$ celery -A tasks worker --loglevel=info

This command starts a worker that registers the tasks in tasks.py (here, add(x, y)) and consumes them from the queue.
Keep that window open, then open a new window and start an interactive session (python or ipython):

>>> from tasks import add
>>> add.delay(4, 4)

At this point you can already use Celery to run tasks; the interactive session above simply invokes the add task and passes it arguments.
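Note that delay() does not block: it immediately returns an AsyncResult handle identifying the queued task. A small sketch in the same session:

>>> r = add.delay(4, 4)  # returns at once, before the worker has run the task
>>> r.id  # the unique id of the queued task
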
But now there is a problem: what if you want to know the result of a task and whether it has finished? For that you need to configure a result backend.
Modify the previous tasks.py as follows:

# coding: utf-8
import subprocess
from time import sleep

from celery import Celery

backend = 'db+mysql://root:@192.168.0.102/celery'
broker = 'amqp://guest@192.168.0.102:5672'

app = Celery('tasks', backend=backend, broker=broker)

@app.task
def add(x, y):
    sleep(10)
    return x + y

@app.task
def hostname():
    return subprocess.check_output(['hostname'])

In addition to adding a backend, a hostname task has been added to test multi-server operation. After making the changes, start the worker as before.
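Note that with a db+mysql backend, the database named in the URI must already exist on the MySQL server; assuming the root user from the code above, creating it might look like:

$ mysql -h 192.168.0.102 -u root -e "CREATE DATABASE celery"
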
Then enter the Python interactive shell again:

>>> from tasks import add, hostname
>>> r = add.delay(4, 4)
>>> r.ready()  # False within the first 10s, because add sleeps for 10s
>>> r = hostname.delay()
>>> r.result  # outputs your hostname
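
If you would rather block until the result arrives, the AsyncResult returned by delay() also has a get() method:

>>> r = add.delay(4, 4)
>>> r.get(timeout=20)  # blocks until the worker finishes (add sleeps for 10s)
8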

4. Testing multiple servers
Having run the test above, a question arises: Celery is described as distributed task management, so where is the "distributed" part? How does it work? On which machine does a task actually execute?
Leaving the Celery worker on the current server running, install Celery on another server in the same way (with a copy of tasks.py), and start a worker there:

$ celery -A tasks worker --loglevel=info

You will find that the hostname task can now return the hostname of the server you just started, provided that server can reach your RabbitMQ broker.
Then enter Python interactive mode:

>>> from tasks import hostname
>>> for i in range(10):
...     r = hostname.delay()
...     print r.result  # outputs a hostname
...
>>>

Watch the worker logs and you will see that the tasks you submitted were handled by the Celery workers on both servers.
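
You can also ask Celery which workers are currently online; run this from any machine that has tasks.py and can reach the broker:

$ celery -A tasks status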

5. RabbitMQ remote connection issues
At the start of the test, the remote server was unable to connect to the local RabbitMQ service. It turned out that the listening address had to be changed: in the file /usr/local/etc/rabbitmq/rabbitmq-env.conf, change the IP in NODE_IP_ADDRESS=127.0.0.1 to 0.0.0.0.
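The relevant line then reads (the path above is the Homebrew location; it differs on other installs):

NODE_IP_ADDRESS=0.0.0.0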

6. Summary
This article has briefly introduced the use of Celery, with a focus on distributed use. One thing that is not so nice: when you scale out, you have to deploy the code (tasks.py) to each new worker again instead of simply sharing the task definitions. Perhaps Celery can match different workers to different tasks? I don't know much about that yet, so I'll cover it another time.
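
For what it's worth, Celery can match tasks to workers through named queues; here is a minimal, untested sketch against the tasks.py above (the queue name math is hypothetical):

# Celery 3.x style routing: send the add task to a dedicated queue
app.conf.update(
    CELERY_ROUTES={'tasks.add': {'queue': 'math'}},
)

A worker started with celery -A tasks worker -Q math would then receive only the tasks sent to that queue.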

