Celery is a distributed task queue module. So how does Celery's distributed execution actually work?
Celery lets multiple machines execute different tasks, or the same task, in parallel.
Any discussion of distributed Celery applications has to mention Celery's message routing mechanism, which follows the AMQP protocol. See the AMQP documentation for the full details.
In simple terms:
There can be multiple message queues, and different messages can be directed to different queues. This is done through an exchange: when you send a message to the broker, you specify a routing_key, and the exchange uses that routing_key to route the message to the matching queue.
Each exchange is bound to one or more queues, so "message routing" is the mechanism that connects an exchange to its queues, and each queue is consumed by its own worker(s).
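The routing mechanism described above can be sketched as a toy model in plain Python. This is purely illustrative, not Celery's or the broker's actual implementation; the class and method names (DirectExchange, bind, publish) are made up for the sketch:

```python
# Toy model of AMQP direct-exchange routing (illustrative only).
class DirectExchange:
    def __init__(self):
        # routing_key -> list of bound queues (plain Python lists here)
        self.bindings = {}

    def bind(self, queue, routing_key):
        self.bindings.setdefault(routing_key, []).append(queue)

    def publish(self, message, routing_key):
        # A direct exchange delivers only to queues bound with the exact key.
        for queue in self.bindings.get(routing_key, []):
            queue.append(message)

exchange = DirectExchange()
for_task_a, for_task_b = [], []
exchange.bind(for_task_a, "for_task_a")
exchange.bind(for_task_b, "for_task_b")

exchange.publish("run taskA", routing_key="for_task_a")
exchange.publish("run taskB", routing_key="for_task_b")
print(for_task_a)  # ['run taskA']
print(for_task_b)  # ['run taskB']
```

A message published with a routing_key that no queue is bound to is simply dropped by a direct exchange, which is why routing configuration has to match on both sides.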
Write an example:
vim demon3.py
from celery import Celery

app = Celery()
app.config_from_object("celeryconfig")

@app.task
def taskA(x, y):
    return x * y

@app.task
def taskB(x, y, z):
    return x + y + z

@app.task
def add(x, y):
    return x + y
vim celeryconfig.py
from kombu import Queue, Exchange

BROKER_URL = "redis://192.168.48.131:6379/1"             # db 1
CELERY_RESULT_BACKEND = "redis://192.168.48.131:6379/2"  # db 2

CELERY_QUEUES = (
    Queue("default", Exchange("default"), routing_key="default"),
    Queue("for_task_a", Exchange("for_task_a"), routing_key="for_task_a"),
    Queue("for_task_b", Exchange("for_task_b"), routing_key="for_task_b"),
)

# Routing
CELERY_ROUTES = {
    "demon3.taskA": {"queue": "for_task_a", "routing_key": "for_task_a"},
    "demon3.taskB": {"queue": "for_task_b", "routing_key": "for_task_b"},
}
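The effect of a routes table like this can be sketched as a simple dictionary lookup. This is a simplified model only (Celery's real router also supports glob patterns, custom router classes, and more); the helper function route_for is invented for illustration:

```python
# Simplified model of how a CELERY_ROUTES-style dict maps a task name
# to a queue and routing key (illustrative only).
CELERY_ROUTES = {
    "demon3.taskA": {"queue": "for_task_a", "routing_key": "for_task_a"},
    "demon3.taskB": {"queue": "for_task_b", "routing_key": "for_task_b"},
}

def route_for(task_name, routes=CELERY_ROUTES, default_queue="celery"):
    # Tasks without an explicit route fall back to the default queue,
    # which is named "celery" out of the box.
    return routes.get(task_name, {"queue": default_queue,
                                  "routing_key": default_queue})

print(route_for("demon3.taskA"))  # {'queue': 'for_task_a', 'routing_key': 'for_task_a'}
print(route_for("demon3.add"))    # {'queue': 'celery', 'routing_key': 'celery'}
```

Note the fallback: any task not listed in the routes table lands in the default queue, so some worker must consume that queue or those tasks will sit unexecuted.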
Upload the two scripts above to the server.
Start a worker that consumes taskA's queue:
# celery -A demon3 worker -l info -n workerA.%h -Q for_task_a
Similarly:
# celery -A demon3 worker -l info -n workerB.%h -Q for_task_b
Then call the tasks remotely from a client. Create a new file:
vim remote.py
from demon3 import *

r1 = taskA.delay(10, 20)
print(r1.result)
print(r1.status)

r2 = taskB.delay(1, 2, 3)
print(r2.result)
print(r2.status)
# print(dir(r2))

r3 = add.delay(1, 2)
print(r3.result)
print(r3.status)  # PENDING
The status is PENDING, meaning the task was not executed. That is because celeryconfig.py does not route add to any queue, so it is delivered to the default queue, named celery, and we have not started a worker to consume the celery queue.
Next, let's start a worker to execute the tasks in the celery queue:
# celery -A demon3 worker -l info -n worker.%h -Q celery  # the default queue
You can now see that this line prints SUCCESS:
print(r3.status)  # SUCCESS
Celery and scheduled tasks
Running a scheduled task in Celery is very simple: just set the CELERYBEAT_SCHEDULE setting in the Celery configuration.
Next, open the config file celeryconfig.py and add the CELERYBEAT_SCHEDULE variable:
CELERY_TIMEZONE = 'UTC'

CELERYBEAT_SCHEDULE = {
    'taskA_schedule': {
        'task': 'demon3.taskA',
        'schedule': 20,           # every 20 seconds
        'args': (5, 6),
    },
    'taskB_schedule': {
        'task': 'demon3.taskB',
        'schedule': 200,          # interval garbled in the original; 200 s assumed
        'args': (10, 20, 30),
    },
    'add_schedule': {
        'task': 'demon3.add',
        'schedule': 10,           # every 10 seconds
        'args': (1, 2),           # arguments garbled in the original; (1, 2) assumed
    },
}
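Under the hood, beat repeatedly checks each entry's interval to decide what is due to run. A minimal sketch of that bookkeeping, purely illustrative (Celery's real scheduler is far more involved; the function due_entries is invented for this sketch):

```python
# Toy interval scheduler: given CELERYBEAT_SCHEDULE-style entries with
# numeric 'schedule' values (seconds), find which entries are due now.
schedule = {
    "taskA_schedule": {"schedule": 20},
    "add_schedule": {"schedule": 10},
}

def due_entries(entries, now, last_run):
    # An entry is due once at least 'schedule' seconds have passed
    # since it last ran (never-run entries count from time 0).
    return sorted(
        name for name, entry in entries.items()
        if now - last_run.get(name, 0) >= entry["schedule"]
    )

last_run = {}
print(due_entries(schedule, 10, last_run))  # ['add_schedule']
last_run["add_schedule"] = 10
print(due_entries(schedule, 20, last_run))  # ['add_schedule', 'taskA_schedule']
```

The key point the sketch illustrates: beat only enqueues tasks on schedule; the actual execution still happens in whatever worker consumes the queue each task is routed to.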
Pay attention to the format, or the schedule will not load correctly.
Start:
celery -A demon3 worker -l info -n workerA.%h -Q for_task_a
celery -A demon3 worker -l info -n workerB.%h -Q for_task_b
celery -A demon3 worker -l info -n worker.%h -Q celery
celery -A demon3 beat