Python Celery in-depth configuration practice tutorial


Celery's official documentation is actually quite good, but some advanced usage is covered only sparsely, and some topics are not covered at all. Using the settings.py of one of my test environments, I will illustrate some tips and solutions for using Celery.

AMQP exchange types

AMQP actually defines four exchange types in total, but we only use three of them here to configure queues. I will explain them one by one; if your English is good, you can go straight to the official English documentation.

First, consider the flow:

celerybeat generates a task message, then sends that message to an exchange.

The exchange decides which queue (or queues) will receive the message, based on the exchange type and the binding key that binds each queue to the exchange.

What we discuss here is how that second step decides who receives the message.

Direct Exchange

As its name suggests, a direct exchange delivers a message straight to a queue: celerybeat sets a routing key on the message, and if that routing key exactly matches the binding key with which a queue is bound to the exchange, the message is transferred directly to that queue.
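The matching rule can be sketched in a few lines of plain Python (a toy model of the documented behaviour; the key names are illustrative, not from the article):

```python
def direct_match(binding_key: str, routing_key: str) -> bool:
    """A direct exchange delivers a message to a queue only when the
    message's routing key equals the queue's binding key exactly."""
    return binding_key == routing_key
```

A message routed with "default_dongwm" reaches a queue bound with the same key, and no other.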

Topic Exchange

Suppose you have three queues X, Y, Z and three messages. You want message A to be processed by X or Y, message B to be processed by X and Z, and message C to be processed by Y and Z. The point here is not a difference between the queues; rather, each message is expected to be handled by the queues relevant to it. A picture may make this easier to understand:

Right: a topic exchange lets you route on attribute-like patterns. You only need to separate the parts of the routing key with '.', for example usa.news, usa.weather, europe.news, and europe.weather in the figure above.
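As a sketch of how topic patterns match, here is my own re-implementation of RabbitMQ's documented rules, where '*' matches exactly one word and '#' matches zero or more words (illustrative code, not from the article):

```python
def topic_match(binding_key: str, routing_key: str) -> bool:
    """Emulate topic-exchange matching: keys are '.'-separated words,
    '*' matches exactly one word, '#' matches zero or more words."""
    return _match(binding_key.split("."), routing_key.split("."))

def _match(pattern, words):
    if not pattern:
        return not words  # pattern exhausted: match only if words are too
    head, rest = pattern[0], pattern[1:]
    if head == "#":
        # '#' may absorb zero or more leading words
        return any(_match(rest, words[i:]) for i in range(len(words) + 1))
    if not words:
        return False
    return head in ("*", words[0]) and _match(rest, words[1:])
```

So a queue bound with "usa.#" receives both usa.news and usa.weather, while "*.news" receives usa.news and europe.news.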

Fanout Exchange

First, let's look at the concept of broadcasting. Imagine you have a task that is time-consuming but has high real-time requirements. You may then need workers on multiple servers to cooperate, each server carrying part of the load; but celerybeat generates only one task, and it would be consumed by a single worker. So you need the queue on each server to receive this message. It is worth noting that with a fanout exchange the message is duplicated, one copy for each bound queue, rather than one message being delivered to a single queue multiple times.
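The copy-per-queue behaviour can be illustrated with a tiny in-memory model (an illustration only, not how RabbitMQ is implemented):

```python
class FanoutExchange:
    """Toy model of a fanout exchange: routing and binding keys are
    ignored, and every bound queue gets its own copy of each message."""

    def __init__(self):
        self.queues = {}

    def bind(self, queue_name, binding_key=None):
        # the binding key is accepted but ignored for fanout exchanges
        self.queues[queue_name] = []

    def publish(self, message, routing_key=None):
        # each bound queue receives an independent copy of the message
        for messages in self.queues.values():
            messages.append(message)
```

Binding two queues (like test and test2 in the settings below) to one fanout exchange and publishing once leaves one copy in each queue.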
My settings.py

Only the part related to celery is shown here:
import djcelery
djcelery.setup_loader()

INSTALLED_APPS = (
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.sites',
    'django.contrib.messages',
    # Uncomment the next line to enable the admin:
    'django.contrib.admin',
    'django.contrib.staticfiles',
    # Uncomment the next line to enable admin documentation:
    # 'django.contrib.admindocs',
    'dongwm.smhome',
    'dongwm.application',
    'djcelery',  # Adding djcelery lets you configure and view celery directly in the Django admin.
    'django_extensions',
    'djsupervisor',
    'django.contrib.humanize',
    'django_jenkins',
)

BROKER_URL = 'amqp://username:password@localhost:5672/yourvhost'

CELERY_IMPORTS = (
    'dongwm.smhome.tasks',
    'dongwm.smdata.task',
)

CELERY_RESULT_BACKEND = "amqp"  # The official docs also recommend the C library librabbitmq for better performance.
CELERY_TASK_RESULT_EXPIRES = 1200  # How long task results are kept before expiring. None of my tasks need to return results; they only need to execute correctly.
CELERYD_CONCURRENCY = 50  # Number of concurrent worker processes, the same value set by the -c command-line option. In practice, more is not always better: enough to keep tasks from piling up, with some headroom for new tasks, is sufficient.
CELERYD_PREFETCH_MULTIPLIER = 4  # Number of tasks each worker prefetches from rabbitmq at a time. I prefetch only 4 and work through them, because my tasks vary in length and prefetching too many would hold tasks back from other workers.
CELERYD_MAX_TASKS_PER_CHILD = 40  # Number of tasks a worker child executes before it is replaced. I suggest a larger number, such as 200.
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'  # Use the default django-celery database scheduler; task schedules are stored in the ORM database you specify.
CELERY_DEFAULT_QUEUE = "default_dongwm"  # Default queue. If a message matches no other queue, it is placed here.

CELERY_QUEUES = {
    "default_dongwm": {  # This is the default queue specified above
        "exchange": "default_dongwm",
        "exchange_type": "direct",
        "routing_key": "default_dongwm",
    },
    "topicqueue": {  # A topic queue: any routing key starting with "topictest" is put in this queue.
        "routing_key": "topictest.#",
        "exchange": "topic_exchange",
        "exchange_type": "topic",
    },
    "test2": {  # test and test2 are two fanout queues. Note that they share the same exchange.
        "exchange": "broadcast_tasks",
        "exchange_type": "fanout",
        "binding_key": "broadcast_tasks",
    },
    "test": {
        "exchange": "broadcast_tasks",
        "exchange_type": "fanout",
        "binding_key": "broadcast_tasks2",
    },
}

class MyRouter(object):

    def route_for_task(self, task, args=None, kwargs=None):
        if task.startswith('topictest'):
            return {
                'queue': 'topicqueue',
            }
        # In my dongwm.tasks module, two tasks start with "test".
        elif task.startswith('dongwm.tasks.test'):
            return {
                "exchange": "broadcast_tasks",
            }
        # Everything else ends up in the default queue
        else:
            return None

# CELERY_ROUTES can also be one large dict containing several sub-dicts, but configuring it with a named router class is cleaner.
CELERY_ROUTES = (MyRouter(),)
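For comparison, the dict form mentioned in the comment above might look like this. The task names here are hypothetical placeholders, since the article does not list its actual task names:

```python
CELERY_ROUTES = {
    # hypothetical task name matching the 'topictest' prefix above
    "topictest.fetch_news": {"queue": "topicqueue"},
    # hypothetical broadcast task name under dongwm.tasks
    "dongwm.tasks.test_refresh": {"exchange": "broadcast_tasks"},
}
```

A router class has the advantage of matching whole prefixes with startswith(), which a plain dict keyed on exact task names cannot do.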
