Multiple queues:
1. The default queue in Celery is named celery and is bound to a direct exchange; it can be changed through the CELERY_DEFAULT_QUEUE / CELERY_DEFAULT_EXCHANGE / CELERY_DEFAULT_ROUTING_KEY settings.
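As a sketch, overriding the defaults might look like the fragment below (the values here are hypothetical; the setting names are the pre-4.0 uppercase style used throughout this article):

```python
# Hypothetical override of Celery's built-in defaults (queue/exchange/key
# are all 'celery' out of the box on a direct exchange).
CELERY_DEFAULT_QUEUE = 'default'
CELERY_DEFAULT_EXCHANGE = 'default'
CELERY_DEFAULT_EXCHANGE_TYPE = 'direct'
CELERY_DEFAULT_ROUTING_KEY = 'default'
```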
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Date    : 2016-12-24 17:32:54
# @Author  : li ([email protected])
# @Link    : http://xmdevops.blog.51cto.com/
# @Version : $Id$
from __future__ import absolute_import
# Description: import public modules
from kombu import Queue, Exchange
# Description: import other modules
# BROKER_URL = 'amqp://root:[email protected]:5672//'
# CELERY_RESULT_BACKEND = 'redis://172.24.10.1:6379/0'
BROKER_URL = 'amqp://root:[email protected]:5672//'
CELERY_RESULT_BACKEND = 'redis://10.2.5.51:5123/0'
CELERY_TASK_SERIALIZER = 'msgpack'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_RESULT_EXPIRES = 60 * 60 * 24
CELERY_ACCEPT_CONTENT = ['json', 'msgpack']
CELERYD_MAX_TASKS_PER_CHILD = 40
CELERY_QUEUES = (
    Queue(name='email_queue', exchange=Exchange('email_exchange', 'direct'), routing_key='email'),
    Queue(name='wixin_queue', exchange=Exchange('wixin_exchange', 'direct'), routing_key='wixin'),
)
CELERY_ROUTES = {
    'work.notify.email.send_mail': {
        'queue': 'email_queue',
        'routing_key': 'email',
    },
}
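CELERY_ROUTES matches on the fully qualified task name and falls back to the default celery queue when nothing matches. A minimal sketch of that lookup, as a plain dictionary with no broker involved (the helper route_for and the second task name are hypothetical):

```python
# Sketch of how a CELERY_ROUTES-style mapping resolves a task name to a queue.
CELERY_ROUTES = {
    'work.notify.email.send_mail': {'queue': 'email_queue', 'routing_key': 'email'},
}

def route_for(task_name, default_queue='celery'):
    # unmatched tasks fall back to the default 'celery' queue
    opts = CELERY_ROUTES.get(task_name, {})
    return opts.get('queue', default_queue)

print(route_for('work.notify.email.send_mail'))   # email_queue
print(route_for('work.notify.wixin.send_wixin'))  # celery
```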
Extension: multiple queues mainly address the limit on the default worker-pool size. When one kind of task arrives in especially large volume, other important messages may be delayed, seriously hurting the user experience. Depending on the business, different tasks can be placed in different queues, each served by a different number of worker processes, to improve overall responsiveness. The configuration above defines email_queue bound to the direct exchange email_exchange, and wixin_queue bound to the direct exchange wixin_exchange.
Question: the queue is supposed to be bound to the exchange, so why does the Queue declaration also take a routing_key parameter? It seems strange...
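One way to think about this question: in Kombu's Queue declaration, the routing_key is the binding key used when the queue is bound to the exchange, and a direct exchange delivers a message only to queues whose binding key exactly equals the message's routing key. A minimal pure-Python simulation of that behavior (the DirectExchange class is hypothetical; no broker is involved):

```python
# Simulate direct-exchange routing: a queue's routing_key acts as the
# binding key the exchange compares each message's routing key against.
class DirectExchange:
    def __init__(self, name):
        self.name = name
        self.bindings = {}  # binding_key -> list of queue names

    def bind(self, queue_name, binding_key):
        self.bindings.setdefault(binding_key, []).append(queue_name)

    def publish(self, message, routing_key):
        # a direct exchange delivers only on an exact binding-key match
        return [(q, message) for q in self.bindings.get(routing_key, [])]

ex = DirectExchange('email_exchange')
ex.bind('email_queue', 'email')  # what Queue(..., routing_key='email') sets up
ex.bind('wixin_queue', 'wixin')

print(ex.publish('hello', 'email'))  # [('email_queue', 'hello')]
print(ex.publish('hello', 'other'))  # []
```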
Note: serving both queues from a single worker process (celery worker -A work.app -l info) is not recommended; it is strongly advised to run a separate worker process per queue (celery worker -A work.app -c 2 -Q email_queue -l info; celery worker -A work.app -c 4 -Q wixin_queue -l info), so that the processes do not affect each other.
This article is from the "Li-Yun Development Road" blog, please be sure to keep this source http://xmdevops.blog.51cto.com/11144840/1886364