Python development modules: Celery, a distributed asynchronous task queue


Objective:

Celery is a distributed asynchronous task queue written in Python. It makes it easy to process tasks asynchronously; if your business scenario calls for asynchronous work, consider using Celery. A couple of example scenarios:

    • You want to run a batch command on 100 machines. It may take a long time, but you do not want your program to block waiting for the results; instead it gets a task ID back immediately, and some time later you can use that ID to fetch the execution results. While the task is still running, you are free to do other work.
    • You want a scheduled job, for example checking all of your customers' records every day and, for any customer whose birthday is today, sending them a greeting message.
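The first scenario — fire off work, keep a handle, collect the result later — is the same pattern the standard library's concurrent.futures offers in-process. A minimal sketch (the `batch_command` function and the host name are made up for illustration; a real deployment would use Celery across machines):

```python
from concurrent.futures import ThreadPoolExecutor
import time

def batch_command(host):
    time.sleep(0.1)              # stand-in for a slow remote command
    return f"{host}: ok"

with ThreadPoolExecutor() as pool:
    future = pool.submit(batch_command, "server-01")  # returns immediately
    # ... continue doing other work here ...
    print(future.result())       # fetch the result when you need it
```

The future plays the role of Celery's task ID: submission returns at once, and the result is collected on demand.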

Celery has the following advantages:

    • Simple: once you are familiar with Celery's workflow, configuration and use are straightforward
    • Highly available: Celery automatically retries a task when it fails or the connection drops during execution
    • Fast: a single Celery process can handle millions of tasks per minute
    • Flexible: almost every Celery component can be extended or customized

Celery's basic workflow: a caller publishes a task message to the broker, a worker consumes and executes it, and the result is written to the result backend.
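As a rough in-process model of that flow — a queue.Queue standing in for the broker and a dict for the result backend; this is a toy analogy, not Celery's actual protocol:

```python
import queue
import threading
import uuid

broker = queue.Queue()   # stands in for RabbitMQ/Redis
backend = {}             # stands in for the result store

def worker():
    task_id, func, args = broker.get()
    backend[task_id] = func(*args)   # store the result under the task id
    broker.task_done()

def delay(func, *args):
    task_id = str(uuid.uuid4())      # the caller gets an id back immediately
    broker.put((task_id, func, args))
    return task_id

threading.Thread(target=worker, daemon=True).start()
tid = delay(lambda x, y: x + y, 4, 6)
broker.join()                        # wait until the worker is done
print(backend[tid])                  # → 10
```

The caller never waits on the computation itself; it holds only the task ID and looks the result up when it chooses.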

1. Celery installation and use

Celery needs to run in a Linux environment:

# install
[[email protected] celerys]# pip3 install celery
# if importing in Python raises no exception, the installation succeeded
[[email protected] celerys]# python3
>>> import celery

Celery's default broker is RabbitMQ, and configuring it takes only one line:

broker_url = 'amqp://guest:guest@localhost:5672//'

You can also use Redis as a broker.

broker_url = 'redis://localhost:6379/0'
# redis://:password@hostname:port/db_number
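These URLs can also live in a separate settings module loaded with `app.config_from_object()`; a minimal sketch, assuming the module is named celeryconfig (the name and the db numbers here are illustrative):

```python
# celeryconfig.py -- load with app.config_from_object('celeryconfig')
broker_url = 'redis://localhost:6379/0'      # where task messages are queued
result_backend = 'redis://localhost:6379/1'  # where task results are stored
```

Keeping configuration out of the task module makes it easier to switch brokers between environments.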

2. Basic usage

Create a task file named tasks.py:

from celery import Celery
import time

app = Celery('cly',                                    # app name, arbitrary
             broker='redis://192.168.1.166:6379/0',    # message broker
             backend='redis://localhost')              # result store

@app.task
def add(x, y):
    time.sleep(10)
    print("running...", x, y)
    return x + y

Start a Celery worker to begin listening for and executing tasks:

# add to PATH
[[email protected] ~]# PATH=$PATH:/usr/local/python3.5/bin/
# start a worker
[[email protected] celerys]# celery -A tasks worker --loglevel=info

Invoke the task:

[[email protected] celerys]# python3
Python 3.5.2 (default, Jul  7 2017, 23:36:01)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-11)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from tasks import add                   # import add
>>> add.delay(4, 6)                         # call the task
<AsyncResult: 4b5a8ab6-693c-4ce5-b779-305cfcdf70cd>   # returns a task id
>>> result = add.delay(4, 6)                # call the task
>>> result.get()                            # fetch the result synchronously, blocking until ready
10
>>> result.get(timeout=1)                   # set a timeout; raises an exception on expiry
Traceback (most recent call last):
    --strip--
celery.exceptions.TimeoutError: The operation timed out.
>>> result = add.delay(4, 'a')              # call with bad arguments
>>> result.get()                            # get() re-raises the task's error as an exception
Traceback (most recent call last):
    --strip--
celery.backends.base.TypeError: unsupported operand type(s) for +: 'int' and 'str'
>>> result = add.delay(4, 'a')
>>> result.get(propagate=False)             # propagate=False: do not raise, return the error info instead
TypeError("unsupported operand type(s) for +: 'int' and 'str'",)
>>> result.traceback                        # the full traceback string, useful for logging
'Traceback (most recent call last):\n  File "/usr/local/python3.5/lib/python3.5/site-packages/celery/app/trace.py", line 367, in trace_task\n    R = retval = fun(*args, **kwargs)\n  File "/usr/local/python3.5/lib/python3.5/site-packages/celery/app/trace.py", line 622, in __protected_call__\n    return self.run(*args, **kwargs)\n  File "/data/celerys/tasks.py", line 12, in add\n    return x+y\nTypeError: unsupported operand type(s) for +: \'int\' and \'str\'\n'
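The get()/get(propagate=False) pair above behaves much like result()/exception() on a standard-library future. As an in-process analogy (plain concurrent.futures, not Celery):

```python
from concurrent.futures import ThreadPoolExecutor

def add(x, y):
    return x + y

with ThreadPoolExecutor() as pool:
    bad = pool.submit(add, 4, 'a')   # this task will fail with TypeError
    exc = bad.exception()            # like get(propagate=False): returns the error object
    print(type(exc).__name__)        # → TypeError
    try:
        bad.result()                 # like get(): re-raises the task's error in the caller
    except TypeError as e:
        print("caught:", e)
```

In both APIs the exception is captured where the task ran and surfaced where the result is fetched.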

The output seen by the worker at this point:

[2017-07-08 03:12:22,565: WARNING/PoolWorker-1] running...     # task received
[2017-07-08 03:12:22,565: WARNING/PoolWorker-1] 4
[2017-07-08 03:12:22,565: WARNING/PoolWorker-1] 6
# task finished; the result is stored on the backend
[2017-07-08 03:12:22,567: INFO/PoolWorker-1] Task tasks.add[683e395e-48b9-4d32-b3bb-1492c62af393] succeeded in 10.01260852499945s: 10

Viewing the data on the broker (that is, 192.168.1.166):

[[email protected] redis-3.0.6]# src/redis-cli
127.0.0.1:6379> keys *
1) "_kombu.binding.celeryev"
2) "unacked_mutex"
3) "_kombu.binding.celery.pidbox"
4) "_kombu.binding.celery"

The data on the backend after execution:

[[email protected] redis-3.0.6]# src/redis-cli   # the data is not deleted after the program calls get()
127.0.0.1:6379> keys *
1) "celery-task-meta-683e395e-48b9-4d32-b3bb-1492c62af393"


