1. Celery Module Invocation
Celery is a distributed task queue, so in what sense is it distributed? Celery can run workers on multiple machines, each executing different tasks or the same task.
Any discussion of Celery as a distributed application has to mention its message routing mechanism, which follows the AMQP protocol (see the AMQP documentation for details). Simply put, there can be multiple message queues, and a message can be directed to a particular queue through an exchange: when a message is published, a routing_key is attached, and the exchange uses that routing_key to route the message to the matching queue.
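To make the exchange/routing_key idea concrete, here is a toy sketch of direct-exchange routing in plain Python. This is not real AMQP and not Celery code; the class and names (DirectExchange, bind, publish) are hypothetical, invented only to illustrate how a routing_key selects a queue.

```python
# Toy sketch of AMQP-style direct routing -- NOT real AMQP, just an
# illustration of how an exchange maps a routing_key to a queue.

class DirectExchange:
    def __init__(self):
        self.bindings = {}  # routing_key -> queue (a plain list)

    def bind(self, routing_key, queue):
        self.bindings[routing_key] = queue

    def publish(self, message, routing_key):
        # A direct exchange delivers only on an exact routing_key match.
        queue = self.bindings.get(routing_key)
        if queue is not None:
            queue.append(message)

for_task_A, for_task_B = [], []
exchange = DirectExchange()
exchange.bind("for_task_A", for_task_A)
exchange.bind("for_task_B", for_task_B)

exchange.publish({"task": "tasks.taskA", "args": (100, 200)}, "for_task_A")
exchange.publish({"task": "tasks.taskB", "args": (1, 2, 3)}, "for_task_B")

print(len(for_task_A), len(for_task_B))  # -> 1 1
```

Celery's configuration below does the same thing declaratively: CELERY_QUEUES declares the bindings, and CELERY_ROUTES attaches a routing_key to each task.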
A multi-worker, multi-queue example:
1. Write tasks.py on the server. It first creates a Celery object, then configures it from celeryconfig.py, and finally defines three tasks: taskA, taskB, and add.
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from celery import Celery, platforms

platforms.C_FORCE_ROOT = True

app = Celery()
app.config_from_object("celeryconfig")

@app.task
def taskA(x, y):
    return x * y

@app.task
def taskB(x, y, z):
    return x + y + z

@app.task
def add(x, y):
    return x + y
2. Write the celeryconfig.py file.
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from kombu import Exchange, Queue
from celery import platforms

platforms.C_FORCE_ROOT = True

BROKER_URL = "redis://localhost:6379/7"
CELERY_RESULT_BACKEND = "redis://localhost:6379/8"

CELERY_QUEUES = (
    Queue("default", Exchange("default"), routing_key="default"),
    Queue("for_task_A", Exchange("for_task_A"), routing_key="for_task_A"),
    Queue("for_task_B", Exchange("for_task_B"), routing_key="for_task_B"),
)

CELERY_ROUTES = {
    "tasks.taskA": {"queue": "for_task_A", "routing_key": "for_task_A"},
    "tasks.taskB": {"queue": "for_task_B", "routing_key": "for_task_B"},
}
3. Start one worker per queue.
celery -A tasks worker -l info -n workerA.%h -Q for_task_A
celery -A tasks worker -l info -n workerB.%h -Q for_task_B
4. Send tasks with arguments.
Open the two files above in a project (e.g. in PyCharm) and write a small client script that sends the tasks:
from tasks import *

re1 = taskA.delay(100, 200)
re2 = taskB.delay(1, 2, 3)
re3 = add.delay(1, 2)

print(re3.status)  # check the state of re3
print(re3.id)      # check the id of re3
After running it you can see that taskA and taskB executed normally.
5. We can also see that the state of add (re3) is PENDING, meaning it has not been executed. This is because celeryconfig.py does not specify a route for add, so it is delivered to the default queue, which is named celery, and we have not started any worker consuming that queue. Next, let's start a worker to handle the tasks in the celery queue.
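As an aside, if you would rather have unrouted tasks land in the "default" queue declared in CELERY_QUEUES instead of the built-in queue named celery, the default queue can be overridden in celeryconfig.py. This is a sketch using the same old-style setting names as the rest of this article, not part of the original configuration:

```python
# Optional: send tasks with no explicit route to the "default" queue
# declared in CELERY_QUEUES, instead of the built-in "celery" queue.
CELERY_DEFAULT_QUEUE = "default"
CELERY_DEFAULT_EXCHANGE = "default"
CELERY_DEFAULT_ROUTING_KEY = "default"
```

With that in place, the worker started with `-Q default` would pick up add instead.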
celery -A tasks worker -l info -n worker.%h -Q celery
Run the client script again: add now executes as well, and its result is stored under its id in the Redis result backend.
2. Celery and Scheduled Tasks
1. Running periodic tasks in Celery is very simple: just set the CELERYBEAT_SCHEDULE setting on the Celery app.
Next we add the CELERYBEAT_SCHEDULE variable to celeryconfig.py:
CELERY_TIMEZONE = 'UTC'

CELERYBEAT_SCHEDULE = {
    'taskA_schedule': {
        'task': 'tasks.taskA',
        'schedule': 20,
        'args': (5, 6),
    },
    'taskB_scheduler': {
        'task': 'tasks.taskB',
        'schedule': 200,
        'args': (10, 20, 30),
    },
    'add_schedule': {
        'task': 'tasks.add',
        'schedule': 10,
        'args': (1, 2),
    },
}
2. Start a worker together with the beat scheduler.
celery -A tasks worker -l info -n workerA.%h -Q for_task_A -B
When startup completes:
taskA runs every 20 seconds: taskA.delay(5, 6)
taskB runs every 200 seconds: taskB.delay(10, 20, 30)
add runs every 10 seconds: add.delay(1, 2)