Using Celery: In-Depth Celery Configuration


Original: http://www.dongwm.com/archives/shi-yong-celeryzhi-shen-ru-celerypei-zhi/

Objective

Celery's official documentation is actually quite good, but some of the deeper usage is scattered across it, and some aspects are not covered at all. Using the settings.py from one of my test environments, I will illustrate a few Celery techniques and solutions.

AMQP Exchange Types

AMQP actually has four exchange types, plus the default type and custom types, but when configuring our queues we will only use three of them. I will explain those here; if your English is good, you can go straight to the English documentation.

First, think about the flow:

    1. celerybeat generates a task message and sends it to an exchange.

    2. The exchange decides which queue(s) receive the message, based on the exchange type and the binding key that was used to bind each queue to the exchange.

What we are going to discuss here is how step 2 decides who receives the message.

    1. Direct Exchange

As the name suggests, a direct exchange delivers a message to a specific queue. The message carries a routing key defined by celerybeat; if that routing key matches the binding key of a queue bound to the exchange, the message is delivered straight to that queue.
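To make this concrete, here is a minimal sketch of a direct binding, written in the same dictionary style as the settings.py further down; the queue/exchange name "email" and the task call are invented for illustration:

# Hypothetical direct binding: the "email" queue only receives messages
# whose routing key is exactly "email" (its binding key).
CELERY_QUEUES = {
    "email": {
        "exchange": "email",
        "exchange_type": "direct",
        "routing_key": "email",
    },
}

# Publishing a task with a matching routing key (the task name is hypothetical):
# send_email.apply_async(exchange="email", routing_key="email")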

    2. Topic Exchange

Imagine this scenario (a small example): you have three queues x, y, z and three messages a, b, c. Message a should be processed by x and y, message b by x and z, and message c by y and z. Note that it is not the queue choosing the messages, but each message wanting to be handled by the relevant queues. A diagram may make this easier to understand.

Yes, a topic exchange lets messages that share the same kind of property be processed alike; you just need routing keys split by '.', for example usa.news, usa.weather, europe.news, europe.weather.
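As a small sketch of what those bindings could look like (the queue names are invented; the style matches my settings.py below):

# Hypothetical topic bindings: "#" matches zero or more words, "*" matches exactly one.
CELERY_QUEUES = {
    "usa_queue": {                  # receives usa.news and usa.weather
        "exchange": "topic_news",
        "exchange_type": "topic",
        "binding_key": "usa.#",
    },
    "news_queue": {                 # receives usa.news and europe.news
        "exchange": "topic_news",
        "exchange_type": "topic",
        "binding_key": "*.news",
    },
}

# A message published with routing_key="usa.news" is copied into both queues;
# routing_key="europe.weather" matches neither binding above.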

    3. Fanout Exchange

First, think about the concept of broadcasting. Suppose you have a task that is quite time-consuming but has very high real-time requirements, so you need workers on several servers to handle it together, each server taking part of the load. But celerybeat only generates one task message, and it would normally be picked up by a single worker, so you need every server's queue to receive this message. Note that a fanout message is duplicated into multiple copies, one for each bound queue, rather than one message being delivered to just one queue.
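As a sketch (the queue names are invented), two queues bound to the same fanout exchange, each consumed by a worker on a different server, could look like this:

# Hypothetical broadcast setup: both queues are bound to the same fanout
# exchange, so every message published to it is copied into both of them.
# A fanout exchange ignores the binding key entirely.
CELERY_QUEUES = {
    "server1_queue": {
        "exchange": "broadcast_tasks",
        "exchange_type": "fanout",
        "binding_key": "broadcast_tasks",
    },
    "server2_queue": {
        "exchange": "broadcast_tasks",
        "exchange_type": "fanout",
        "binding_key": "broadcast_tasks",
    },
}

# Each server then starts a worker that consumes only its own queue, e.g.:
#   server 1: python manage.py celery worker -Q server1_queue
#   server 2: python manage.py celery worker -Q server2_queue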

My settings.py.

Here are just some of the parts related to celery:

import djcelery
djcelery.setup_loader()

INSTALLED_APPS = (
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.sites',
    # 'django.contrib.staticfiles',
    'django.contrib.messages',
    # Uncomment the next line to enable the admin:
    'django.contrib.admin',
    'django.contrib.staticfiles',
    # Uncomment the next line to enable admin documentation:
    # 'django.contrib.admindocs',
    'dongwm.smhome',
    'dongwm.apply',
    'djcelery',  # This adds djcelery to the Django admin so celery can be configured and inspected there
    'django_extensions',
    'djsupervisor',
    'django.contrib.humanize',
    'django_jenkins',
)

BROKER_URL = 'amqp://username:[email protected]:5672/yourvhost'

CELERY_IMPORTS = (
    'dongwm.smhome.tasks',
    'dongwm.smdata.tasks',
)

# The C implementation librabbitmq is also recommended when optimizing the site
CELERY_RESULT_BACKEND = "amqp"

# Expiry time for task results; my tasks do not need to return results,
# they only need to execute correctly
CELERY_TASK_RESULT_EXPIRES = 1200

# Worker concurrency, the same number you would pass to -c on the command line.
# In practice more workers is not always better: it is enough to keep tasks
# from piling up, with some headroom for new tasks
CELERYD_CONCURRENCY = 50

# How many tasks each worker prefetches from RabbitMQ at a time. I prefetch 4
# and work through them slowly; because tasks vary in length I do not
# prefetch too many
CELERYD_PREFETCH_MULTIPLIER = 4

# After how many tasks a worker child process is replaced; I suggest a larger number
CELERYD_MAX_TASKS_PER_CHILD = 40

# This uses the django-celery database scheduler; the task execution periods
# live in the ORM database you specify
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'

# Default queue: a message that matches no other queue is placed here
CELERY_DEFAULT_QUEUE = "default_dongwm"

CELERY_QUEUES = {
    "default_dongwm": {  # This is the default queue specified above
        "exchange": "default_dongwm",
        "exchange_type": "direct",
        "routing_key": "default_dongwm",
    },
    "topicqueue": {  # A topic queue: routing keys starting with "topictest" end up here
        "routing_key": "topictest.#",
        "exchange": "topic_exchange",
        "exchange_type": "topic",
    },
    "test2": {  # test and test2 are two fanout queues; note that they share the same exchange
        "exchange": "broadcast_tasks",
        "exchange_type": "fanout",
        "binding_key": "broadcast_tasks",
    },
    "test": {
        "exchange": "broadcast_tasks",
        "exchange_type": "fanout",
        "binding_key": "broadcast_tasks2",
    },
}


class MyRouter(object):

    def route_for_task(self, task, args=None, kwargs=None):
        if task.startswith('topictest'):
            return {
                'queue': 'topicqueue',
            }
        # There are 2 tasks in my dongwm.tasks file whose names start with "test"
        elif task.startswith('dongwm.tasks.test'):
            return {
                "exchange": "broadcast_tasks",
            }
        # Everything else ends up in the default queue
        else:
            return None

# CELERY_ROUTES could also be one big dictionary containing several dictionaries,
# but it is cleaner to match task names directly with a router class
CELERY_ROUTES = (MyRouter(),)
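For completeness, a short sketch of how tasks would flow through the configuration above; the task names here are made up for illustration:

from celery.task import task

@task(name='topictest.fetch_weather')   # hypothetical task name
def fetch_weather():
    pass

@task(name='dongwm.tasks.test_ping')    # hypothetical task name
def test_ping():
    pass

# MyRouter routes this to the "topicqueue" queue, because the task name
# starts with 'topictest':
fetch_weather.apply_async()

# This name starts with 'dongwm.tasks.test', so the message is published to
# the fanout exchange "broadcast_tasks" and copied into both "test" and "test2":
test_ping.apply_async()

# Any task the router returns None for falls back to CELERY_DEFAULT_QUEUE,
# i.e. "default_dongwm".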

