Celery
Objective:
Celery is a distributed asynchronous task queue written in Python that makes it easy to run tasks asynchronously. If your business scenario calls for asynchronous tasks, consider using Celery. A few example scenarios:
- You want to run a batch command on 100 machines. It may take a long time, but you don't want your program to block waiting for the results; instead it immediately returns a task ID, and for a period of time afterwards you can use that task ID to fetch the execution result. While the task is running, you can get on with other things.
- You want a scheduled task, e.g. go through all of your customers' information every day and, if today is a customer's birthday, send them a greeting message (a minimal scheduling sketch follows this list).
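The second scenario maps naturally onto Celery's beat scheduler. Below is a minimal sketch, assuming the code lives in a file named tasks.py; send_birthday_greetings is a hypothetical placeholder for your own logic:

```python
from celery import Celery
from celery.schedules import crontab

app = Celery('cly', broker='redis://localhost:6379/0')

@app.task
def send_birthday_greetings():
    # hypothetical placeholder: scan customers and message those born today
    pass

# fire the task every day at 08:00
app.conf.beat_schedule = {
    'daily-birthday-check': {
        'task': 'tasks.send_birthday_greetings',
        'schedule': crontab(hour=8, minute=0),
    },
}
```

A worker executes the task as usual; a separate `celery -A tasks beat` process is what actually triggers it on schedule.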
Celery has the following advantages:
- Simple: once you are familiar with Celery's workflow, configuration and use are relatively simple
- Highly available: Celery automatically retries a task when it fails or the connection drops during execution
- Fast: a single Celery process can handle millions of tasks per minute
- Flexible: almost every Celery component can be extended and customized
Celery's basic workflow (the original flow chart is omitted here): a client publishes tasks to a broker (message queue), workers consume and execute them, and results are written to a result backend.
1. Celery installation and use
Celery needs to run in a Linux environment:
```
# install
[root@localhost celerys]# pip3 install celery
# importing in Python without an exception indicates a successful installation
[root@localhost celerys]# python3
>>> import celery
```
Celery's default broker is RabbitMQ, and configuring it requires only one line:
```
broker_url = 'amqp://guest:guest@localhost:5672//'
```
You can also use Redis as the broker:
```
broker_url = 'redis://localhost:6379/0'
# format: redis://:password@hostname:port/db_number
```
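Either line is an ordinary Celery setting. As a minimal sketch (the app name 'demo' is arbitrary), it can be applied in two equivalent ways:

```python
from celery import Celery

# 1) pass the broker URL straight to the constructor
app = Celery('demo', broker='redis://localhost:6379/0')

# 2) or set it on the configuration object after creation
app.conf.broker_url = 'redis://localhost:6379/0'
```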
2. Basic usage
Create a task file named tasks.py:
```python
from celery import Celery
import time

app = Celery('cly',                                  # arbitrary app name
             broker='redis://192.168.1.166:6379/0',  # message middleware
             backend='redis://localhost')            # result store

@app.task
def add(x, y):
    time.sleep(10)
    print("running ...", x, y)
    return x + y
```
Start a Celery worker to begin listening for and executing tasks:
```
# add the environment variable
[root@localhost ~]# PATH=$PATH:/usr/local/python3.5/bin/
# start a worker
[root@localhost celerys]# celery -A tasks worker --loglevel=info
```
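For anything beyond a quick test you would usually daemonize the worker instead; a sketch using celery multi (the pid/log paths are illustrative):

```
[root@localhost celerys]# celery multi start w1 -A tasks --loglevel=info \
    --pidfile=/var/run/celery/%n.pid --logfile=/var/log/celery/%n.log
```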
Invoke the task:
```
[root@localhost celerys]# python3
Python 3.5.2 (default, Jul 7, 23:36:01)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-11)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from tasks import add            # import add
>>> add.delay(4, 6)                  # run the function asynchronously
<AsyncResult: 4b5a8ab6-693c-4ce5-b779-305cfcdf70cd>   # a task id is returned
>>> result = add.delay(4, 6)         # run the function
>>> result.get()                     # fetch the result synchronously (blocks while the task runs)
10
>>> result.get(timeout=1)            # set a timeout; an exception is raised when it expires
Traceback (most recent call last):
--strip--
celery.exceptions.TimeoutError: The operation timed out.
>>> result = add.delay(4, 'a')       # run a call that will fail
>>> result.get()                     # get() re-raises the task's exception
Traceback (most recent call last):
--strip--
TypeError: unsupported operand type(s) for +: 'int' and 'str'
>>> result = add.delay(4, 'a')
>>> result.get(propagate=False)      # propagate=False suppresses the exception and returns the error info instead
TypeError("unsupported operand type(s) for +: 'int' and 'str'",)
>>> result.traceback                 # fetch the full traceback for logging
'Traceback (most recent call last):\n  File "/usr/local/python3.5/lib/python3.5/site-packages/celery/app/trace.py", line 367, in trace_task\n    R = retval = fun(*args, **kwargs)\n  File "/usr/local/python3.5/lib/python3.5/site-packages/celery/app/trace.py", line 622, in __protected_call__\n    return self.run(*args, **kwargs)\n  File "/data/celerys/tasks.py", line A, in add\n    return x+y\nTypeError: unsupported operand type(s) for +: \'int\' and \'str\'\n'
```
Output received on the worker side at this point:
```
# task received
[2017-07-08 03:12:22,565: WARNING/PoolWorker-1] running ...
[2017-07-08 03:12:22,565: WARNING/PoolWorker-1] 4
[2017-07-08 03:12:22,565: WARNING/PoolWorker-1] 6
# when the task completes, the result is stored on the backend side
[2017-07-08 03:12:22,567: INFO/PoolWorker-1] Task tasks.add[683e395e-48b9-4d32-b3bb-1492c62af393] succeeded in 10.01260852499945s: 10
```
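The first scenario from the overview (hold on to a task ID and fetch the result later) also works outside the interactive session. A minimal sketch, reusing the tasks.py defined above:

```python
from celery.result import AsyncResult

from tasks import add, app

task_id = add.delay(4, 6).id          # submit and keep only the task id
# ... do other work, possibly in a different process ...
res = AsyncResult(task_id, app=app)   # rebuild a result handle from the id
print(res.state)                      # PENDING / SUCCESS / FAILURE ...
print(res.get(timeout=20))            # block until the result (10) arrives
```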
View the data on the broker side (i.e. 192.168.1.166):
```
[root@localhost redis-3.0.6]# src/redis-cli
127.0.0.1:6379> keys *
1) "_kombu.binding.celeryev"
2) "unacked_mutex"
3) "_kombu.binding.celery.pidbox"
4) "_kombu.binding.celery"
```
After execution, the data on the backend side:
```
[root@localhost redis-3.0.6]# src/redis-cli
127.0.0.1:6379> keys *
# the data is not deleted after the program calls get()
1) "celery-task-meta-683e395e-48b9-4d32-b3bb-1492c62af393"
```
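That key can be read back with any Redis client. A sketch using redis-py, assuming the backend's default JSON result serializer:

```python
import json
import redis

r = redis.Redis(host='localhost', port=6379, db=0)
raw = r.get('celery-task-meta-683e395e-48b9-4d32-b3bb-1492c62af393')
print(json.loads(raw.decode()))   # e.g. {'status': 'SUCCESS', 'result': 10, ...}
```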