Using Celery in Django as an asynchronous task queue (django-celery)

Source: Internet
Author: User
Tags: rabbitmq, redis, install

1 Celery Introduction

Celery is an asynchronous task queue that runs independently of the main process: after the main process exits, tasks in the queue continue to execute.

If the worker is restarted, it resumes executing the remaining tasks in the queue, and tasks submitted while it was stopped are buffered. This behavior depends on the message queue (MQ, e.g. RabbitMQ or Redis).

 

1.1 Celery Principle

 

 

Celery's architecture consists of three parts: the message broker, workers, and the task result store.

  • Message broker: Celery itself does not provide a message service, but it integrates easily with third-party message brokers, including RabbitMQ, Redis, MongoDB (experimental), Amazon SQS (experimental), CouchDB (experimental), SQLAlchemy (experimental), Django ORM (experimental), and IronMQ. RabbitMQ and Redis are the recommended message queues.
  • Task execution unit: the worker is Celery's unit of task execution; workers run concurrently on distributed system nodes.
  • Task result store: stores the results of tasks executed by workers. Celery supports several storage backends, including AMQP, Redis, memcached, MongoDB, SQLAlchemy, Django ORM, Apache Cassandra, and IronCache.
1.2 Celery application scenarios
  • Asynchronous task processing: for example, sending an SMS or confirmation email to a newly registered user.
  • Long-running tasks: work that takes a long time to complete, such as video or image processing, watermarking, and transcoding.
  • Scheduled tasks: tasks executed periodically or at a specific time, such as regularly running a performance stress test.

 

2 Celery development environment preparation

2.1 Environment preparation

Software | Version             | Description
---------|---------------------|------------------------
Linux    | CentOS 6.5 (64-bit) | Operating system
Python   | 3.5.2               |
Django   | 1.10                | Web framework
Celery   | 4.0.2               | Asynchronous task queue
Redis    | 2.4                 | Message queue

 

2.2 Celery Installation

Usage:

Celery depends on a message queue, so Redis or RabbitMQ must be installed before use.

Here we use Redis. Install the redis server:

sudo yum install redis

  

Start redis:

sudo service redis start

 

Install the celery Library

sudo pip install celery==4.0.2

 

3 Executing tasks with Celery

3.1 Writing a standalone task

Create a task.py file.

Note: the configuration is loaded when the Celery instance is initialized; redis is used both as the message queue and to store task results.

 

 

Run celery:

$ celery -A task worker --loglevel=info

If celery starts successfully, the worker banner and the list of registered tasks are printed to the console.

 

3.2 call a task

Open a Python interactive shell and run the following code:

 

You can view the task execution information in the celery window.

 

 

Task execution status monitoring and result retrieval:

 

 

 

3.3 Summary of task call Methods

There are two methods:

delay and apply_async; delay is a simplified shortcut for apply_async.

add.delay(2, 2)
add.apply_async((2, 2))
add.apply_async((2, 2), queue='lopri')

 


The apply_async method accepts many configuration parameters, including the target queue.

  • queue specifies the queue name; different tasks can be routed to different queues.

 

3.4 task status

Each task has three statuses:

PENDING -> STARTED -> SUCCESS

 

Query the current status of a task:

res.state

 

 

4. Integration with Django

The sections above introduced the basics of celery asynchronous tasks. For our actual application we need to use Celery together with Django; the following describes how.

4.1 integration with Django

There are two ways to integrate with Django:

  • Django 1.8 or later: integrate with Celery 4.0
  • Django versions earlier than 1.8: integrate with Celery 3.1 using the django-celery library

 

This article covers the integration of Celery 4.0 with Django 1.8 and later.

4.2 create a project file

Create a project named proj

- proj/
  - proj/__init__.py
  - proj/settings.py
  - proj/urls.py
  - proj/wsgi.py
- manage.py

 

Create a new file: proj/mycelery.py

from __future__ import absolute_import, unicode_literals
import os

from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

 

Add the following to proj/__init__.py:

from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .mycelery import app as celery_app

__all__ = ['celery_app']

 

4.3 configure Celery

This line in mycelery.py tells Celery to read its configuration from settings.py, using keys that start with the CELERY_ prefix:

   
app.config_from_object('django.conf:settings', namespace='CELERY')

 

Add the celery configuration in the settings. py file:
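The settings listing did not survive extraction; below is a typical fragment matching the description that follows. The URLs and serializer values are an assumed example — with namespace='CELERY', each key maps to the celery option without the prefix.

```python
# settings.py -- celery section (assumed example values)
CELERY_BROKER_URL = 'redis://127.0.0.1:6379/0'      # message broker
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/0'  # task result store
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json']
```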

 

 

Our configuration uses redis as the message queue: both the message broker and the result backend live in redis, and tasks are serialized as JSON.

Important: redis://127.0.0.1:6379/0 refers to redis database 0. If multiple celery applications share the same database, their tasks can get mixed up, so it is recommended that each celery instance use a separate database.

4.4 create an APP

Create a Django app named celery_task, then create a tasks.py file in the app directory.

After completion, the directory structure is:

├── celery_task
│   ├── admin.py
│   ├── apps.py
│   ├── __init__.py
│   ├── migrations
│   │   └── __init__.py
│   ├── models.py
│   ├── tasks.py
│   ├── tests.py
│   └── views.py
├── db.sqlite3
├── manage.py
├── proj
│   ├── mycelery.py
│   ├── __init__.py
│   ├── settings.py
│   ├── urls.py
│   └── wsgi.py
└── templates

 

4.5 Write a task

Add the following code to tasks.py:

# Create your tasks here
from __future__ import absolute_import, unicode_literals

from celery import shared_task


@shared_task
def add(x, y):
    return x + y


@shared_task
def mul(x, y):
    return x * y


@shared_task
def xsum(numbers):
    return sum(numbers)

 

Start celery:

celery -A proj.mycelery worker -l info

 

Note: proj is the package name, and mycelery is the module in which the celery app is defined.

If startup succeeds, the worker banner is printed:

 

 

4.6 call a task in views

Write two views that implement the following:

  • Trigger the task and return its task ID.
  • Query the task status by task ID.

The Code is as follows:

 

 

Start django.

Start celery for a new session. The startup command is:

celery -A proj.mycelery worker -l info

 

Access http://127.0.0.1:8000/add to view the returned result.

 

 

On the celery running page, you can see the following output:

 

 

4.7 query the task status in views

Sometimes a task takes a long time to execute and you need to check whether it has completed. You can query the task status by task ID and decide the next step based on that status.

When the task has finished, the returned status is SUCCESS.

 

 

5 Celery scheduled task

Beyond asynchronous execution, Celery can run tasks on a schedule we define, such as daily database backups or log rotation.

Celery's scheduled task configuration is very simple:

The scheduled task configuration also goes in the settings.py file.

Note: if keeping celery's configuration inside Django's settings.py is inconvenient, you can split it into a separate settings module and point Celery at it in mycelery.py:

app.config_from_object('django.conf:yoursettingsfile', namespace='CELERY')

 

 

5.1 run tasks at intervals
# Call the tasks.add task every 30 seconds
from datetime import timedelta

CELERY_BEAT_SCHEDULE = {
    'add-every-30-seconds': {
        'task': 'tasks.add',
        'schedule': timedelta(seconds=30),
        'args': (16, 16),
    },
}

 

5.2 scheduled execution

Scheduled to run every morning at 7:30.

Note: Pay attention to the time format when setting the task time, in UTC or local time.

# Crontab schedule
# Call the tasks.add task every morning at 7:30 a.m.
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'add-every-morning': {
        'task': 'tasks.add',
        'schedule': crontab(hour=7, minute=30),
        'args': (16, 16),
    },
}

 

5.3 Start a scheduled task

Once a scheduled task is configured, you need to start a beat process in addition to the worker process.

The beat process acts as the scheduler: it dispatches tasks according to the configured schedule.

5.3.1 start the beat Process

The command is as follows:

celery -A proj.mycelery beat -l info

 

 

5.3.2 start the worker Process

The worker process is started with the same command as before:

 

celery -A proj.mycelery worker -l info

 

6 Celery in-depth

Celery tasks support a variety of running modes:

  • Dynamically scaling the number of worker processes: --autoscale=10,3 (always keep 3 processes, growing to 10 when needed)
  • Chained tasks
  • Group tasks
  • Different task priorities
  • Dispatching tasks to specific queues
  • Running the worker in eventlet mode

For example, to set the concurrency to 1000:

celery -A proj.mycelery worker -c 1000

 

You can explore these capabilities as your own usage requires.

 

 

7 References

Celery official website:
http://docs.celeryproject.org/en/latest/index.html

Celery and Django:
http://docs.celeryproject.org/en/latest/getting-started/next-steps.html#next-steps

Celery scheduled tasks:
http://blog.csdn.net/sicofield/article/details/50937338
