Table of contents

Preface
Celery periodic (timed) tasks
Celery synchronous calls
Celery result storage
Celery monitoring
Celery debugging
Distributed task queue Celery: a detailed workflow

Preface
Continuing directly from the previous article on Celery application basics, the examples below are still modifications of the earlier proj project.

Celery periodic (timed) tasks

Celery's periodic task feature is provided by the beat task scheduler module. Beat is a service process responsible for periodically launching the tasks defined in beat_schedule.
e.g.
# filename: app_factory.py
from __future__ import absolute_import

from celery import Celery
from kombu import Queue, Exchange


def make_app():
    app = Celery('proj')
    app.config_from_object('proj.celeryconfig')

    default_exchange = Exchange('default', type='direct')
    web_exchange = Exchange('task', type='direct')

    app.conf.task_default_queue = 'default'
    app.conf.task_default_exchange = 'default'
    app.conf.task_default_routing_key = 'default'
    app.conf.task_queues = (
        Queue('default', default_exchange, routing_key='default'),
        Queue('high_queue', web_exchange, routing_key='high_task'),
        Queue('low_queue', web_exchange, routing_key='low_task'),
    )

    # set the beat timezone; the default is UTC
    app.conf.timezone = 'Asia/Shanghai'

    # declare periodic tasks in beat_schedule
    app.conf.beat_schedule = {
        # friendly name of the periodic task
        'periodic_task_add': {
            # full path of the task
            'task': 'proj.task.tasks.add',
            # period, in seconds
            'schedule': 3.0,
            # arguments required by the task
            'args': (2, 2),
        },
    }
    return app
Use the -B option to start the beat module together with the Celery worker service process, as shown below.
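A minimal sketch of the start command, assuming the proj application from the examples above:

$ celery worker -A proj -B -l info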
Note 1: Beat stores the periodic task schedule in a celerybeat-schedule file, which is generated in the directory where the command is run. When the timezone changes, beat automatically adjusts the schedule according to the contents of celerybeat-schedule.
Note 2: Beat also supports crontab-style scheduling, which is very easy to use.
e.g.
# filename: app_factory.py
from celery.schedules import crontab

...
app.conf.beat_schedule = {
    'periodic_task_add': {
        'task': 'proj.task.tasks.add',
        # execute every minute
        'schedule': crontab(minute='*/1'),
        'args': (2, 2),
    },
}
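crontab also accepts finer-grained fields such as hour and day_of_week. A sketch of an alternative schedule entry for the same task (the timing values here are purely illustrative), running it every Monday at 7:30 a.m.:

# every Monday at 7:30 a.m.
'schedule': crontab(hour=7, minute=30, day_of_week=1),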
Celery synchronous calls

The get method of a task result (AsyncResult), normally used to retrieve a task's execution result, can also be used to implement synchronous Celery calls to accommodate more scenarios.
e.g.
# filename: tasks.py
import time

from proj.celery import app


@app.task
def add(x, y, debug=False):
    # test sync invoke: wait a while before returning
    for i in xrange(10):
        time.sleep(1)
        print("waiting: %s s" % i)
    if debug:
        print("x: %s; y: %s" % (x, y))
    return x + y
Call the task add synchronously:
>>> from proj.task.tasks import add
>>> add.delay(2, 2).get()
4
Because the get method is called directly, the process blocks until the task add returns its result.
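get also accepts a timeout in seconds; a small sketch, assuming the same add task, where waiting longer than 10 seconds raises celery.exceptions.TimeoutError instead of blocking indefinitely:

>>> add.delay(2, 2).get(timeout=10)
4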
Celery result storage

If you care about the results of task execution, you can use a database (e.g. Redis) as the result backend, thereby persisting execution results.
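A minimal sketch of enabling such a backend in the configuration module; the Redis host, port, and database number below are assumptions to adjust for your deployment:

# filename: celeryconfig.py
# persist task results in a local Redis instance (illustrative URL)
result_backend = 'redis://localhost:6379/0'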
e.g.
# run a task and get the task ID
>>> from proj.task.tasks import add
>>> result = add.delay(2, 2)
>>> result.status
u'SUCCESS'
>>> result.get()
4
>>> result.id
'65cee5e0-5f4f-4d2b-b52f-6904e7f2b6ab'
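Because the backend persists results, the task id alone is enough to fetch a result later, even from another process; a sketch assuming the same app instance:

>>> from proj.celery import app
>>> from celery.result import AsyncResult
>>> AsyncResult('65cee5e0-5f4f-4d2b-b52f-6904e7f2b6ab', app=app).get()
4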
Open the Redis database to view the record for this task.
root@aju-test-env:~# redis-cli
127.0.0.1:6379>
# view all keys in Redis
127.0.0.1:6379> keys *
1) "celery-task-meta-da3f6f3d-f977-4b39-a795-eaa89aca03ec"
2) "celery-task-meta-38437d5c-ebd8-442c-8605-435a48853085"
...
"celery-task-meta-65cee5e0-5f4f-4d2b-b52f-6904e7f2b6ab"
...
# the task ID locates the task's value in Redis
127.0.0.1:6379> get "celery-task-meta-65cee5e0-5f4f-4d2b-b52f-6904e7f2b6ab"
"{\"status\": \"SUCCESS\", \"traceback\": null, \"result\": 4, \"task_id\": \"65cee5e0-5f4f-4d2b-b52f-6904e7f2b6ab\", \"children\": []}"
Celery monitoring

Celery Flower is the officially recommended monitoring tool for Celery. Via the Celery events interface, Flower can monitor Celery workers, tasks, brokers, concurrency pools, and other important objects in real time.

Install Flower
$ pip install flower
Enable Celery events
$ celery worker -A proj -E -l info
Enable the RabbitMQ management plugin
$ rabbitmq-plugins enable rabbitmq_management
$ service rabbitmq-server restart
Start Flower and specify the broker API URL
$ celery flower -l info --broker_api=http://guest:guest@<rabbitmq_server_ip>:15672/api/
To access the Flower web UI, open http://<flower_server_ip>:5555/dashboard in a browser.
Celery debugging

Celery supports remote pdb debugging over Telnet, which is very convenient.
# filename: tasks.py
from proj.celery import app
from celery.contrib import rdb


@app.task
def add(x, y):
    # set a breakpoint
    rdb.set_trace()
    return x + y
Set the breakpoint with celery.contrib.rdb as above, then restart the Celery worker service.
The worker log will show the address for the Telnet remote connection; open another terminal, run the telnet command to connect, and you land in the very familiar pdb shell.
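For example, assuming the worker log reported rdb's default base port 6899 on the local host (the bind address and port can be changed via the CELERY_RDB_HOST and CELERY_RDB_PORT environment variables):

$ telnet localhost 6899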