Introduction to the Asynchronous Task Tool Celery (I)

A Django project ran into a problem: after receiving a request, the server has to perform a series of long-running operations, and the user is left waiting. Can we return a response to the user immediately and run those operations in the background?

Crontab-style scheduled tasks cannot easily meet this requirement; asynchronous tasks are a better fit. Celery, written in Python, is a very useful asynchronous task tool.
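As a rough sketch of that pattern (the view and task names below are illustrative, not from the original project), a Django view can hand the slow work to a Celery task and respond immediately:

# views.py -- minimal sketch; long_running_job is a hypothetical Celery task
from django.http import JsonResponse

from .tasks import long_running_job  # assumed to be defined with @app.task

def submit(request):
    # .delay() only puts a message on the broker and returns at once
    result = long_running_job.delay(request.GET.get("data"))
    # respond immediately; the client can poll for the result using this id
    return JsonResponse({"task_id": result.id})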

Broker, worker, and backend

The Celery architecture consists of three parts: the message middleware (broker), the task execution unit (worker), and the task result store (result backend).

When the application calls a Celery task, a message is sent to the broker; a worker then receives the message and executes the task. The backend is used to store the task's state and result.

Message Middleware broker

Celery itself does not provide message service, but it can be easily integrated with the message middleware provided by a third party.

RabbitMQ is the most complete and best-supported message middleware, and can be used as follows:

See "Using RabbitMQ" in the official Celery documentation.
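As a minimal sketch (assuming RabbitMQ is running locally with its default guest account), pointing Celery at RabbitMQ is just a matter of the broker URL:

from celery import Celery

# amqp:// is the RabbitMQ transport; this URL assumes a default local install
app = Celery('tasks', broker='amqp://guest:guest@localhost:5672//')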

Redis also works, although it carries some risk of message loss:

See "Using Redis" in the official Celery documentation.
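A sketch of the Redis broker URL format (the password, host, and database number below are placeholders):

from celery import Celery

# Format: redis://:password@hostname:port/db_number (the password part is optional)
app = Celery('tasks', broker='redis://:mypassword@localhost:6379/0')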

For other brokers, see the broker overview in the official documentation.

Task execution unit worker

The worker is Celery's task execution unit. Workers run concurrently and can be distributed across multiple nodes.
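For example (the flags below are standard celery worker options; the node name is illustrative), a worker can be started with a fixed concurrency level and a custom node name:

$ celery -A tasks worker --loglevel=info --concurrency=4 -n worker1@%h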

Task result storage backend

The backend stores the results of tasks executed by workers. Supported options include SQLAlchemy/Django ORM, Memcached, Redis, and RPC (RabbitMQ/AMQP).
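As a rough sketch of how a backend is selected (the connection details below are placeholders, not from this article), each option is chosen via the backend URL passed to Celery:

from celery import Celery

# Pick exactly one backend URL (connection details are illustrative):
app = Celery('tasks',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/1')          # Redis
# backend='rpc://'                                        # RPC (RabbitMQ/AMQP)
# backend='db+sqlite:///results.db'                       # SQLAlchemy database
# backend='cache+memcached://127.0.0.1:11211/'            # Memcached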

Download

Installing Celery is simple:

$ pip install celery

Here we use Redis as the broker for practice. This requires extra dependencies, which can be installed together with Celery:

$ pip install -U "celery[redis]"

Write an application

Write a simple application, tasks.py:

from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')

@app.task
def add(x, y):
    return x + y

Run worker

Run:

$ celery -A tasks worker --loglevel=info

The output is as follows:

[2017-09-10 06:59:58,665: INFO/MainProcess] Connected to redis://localhost:6379/0
[2017-09-10 06:59:58,671: INFO/MainProcess] mingle: searching for neighbors
[2017-09-10 06:59:59,688: INFO/MainProcess] mingle: all alone
[2017-09-10 06:59:59,724: INFO/MainProcess] celery@ubuntu ready.

Send task

Enter the Python interactive environment:

>>> from tasks import add
>>> add.delay(4, 4)

The worker log shows that the task was received and processed:

[2017-09-10 07:02:34,874: INFO/MainProcess] Received task: task.add[40ec89c8-0a23-4a26-9da0-7f418c50f4cb]
[2017-09-10 07:02:34,876: INFO/ForkPoolWorker-1] Task task.add[40ec89c8-0a23-4a26-9da0-7f418c50f4cb] succeeded in 0.000579041981837s: 8
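Note that delay() is a shortcut for apply_async(), which also accepts execution options. A minimal sketch (the countdown value is just an example):

>>> from tasks import add
>>> add.apply_async((4, 4), countdown=10)   # run the task 10 seconds from now
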
Store results

To use Redis as the result backend, modify tasks.py:

app = Celery('tasks', backend='redis://localhost:6379/0', broker='redis://localhost:6379/0')

You can view the result after running:

>>> from tasks import add
>>> r = add.delay(3, 4)
>>> r.result
7
>>> r.ready()
True
>>> r.get(timeout=1)
7

You can view the storage in Redis:

127.0.0.1:6379> get celery-task-meta-f2032d3e-f9a0-425d-bce4-f55ce58c8706
"{\"status\": \"SUCCESS\", \"traceback\": null, \"result\": 7, \"task_id\": \"f2032d3e-f9a0-425d-bce4-f55ce58c8706\", \"children\": []}"
127.0.0.1:6379>
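Because the result is stored in the backend under the task id, it can also be fetched later, even from a different process. A minimal sketch (using the task id returned by delay() above):

>>> from celery.result import AsyncResult
>>> from tasks import app
>>> AsyncResult('f2032d3e-f9a0-425d-bce4-f55ce58c8706', app=app).get(timeout=1)
7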

This is super simple. For more information about Celery, see the official documentation.
