Concise Notes on Celery, a Tool for Asynchronous Tasks

Source: Internet
Author: User
Tags: rabbitmq

Reposted from: http://www.jianshu.com/p/1840035cb510

Asynchronous tasks are a very common pattern in web development: time-consuming, resource-intensive operations are isolated from the main application and executed asynchronously. Take a registration feature as an example. After a user registers with an email address, you need to send an activation email to that address. If this is done directly in the application, the call that sends the email blocks on network IO and the whole request waits with it. A more elegant approach is to trigger an asynchronous task from the business logic and return immediately.

There are many tools for implementing asynchronous tasks, and all of them rely on a task queue. For example, you can build a simple message queue on top of Redis using the producer-consumer model or the publish-subscribe pattern.
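The producer-consumer model these tools build on can be sketched with the standard library alone. The toy in-process version below is only an illustration: `queue.Queue` stands in for a Redis list, and the `worker` function plays the role a separate worker process would play.

```python
import queue
import threading

# A toy in-process task queue. Between processes, Redis LPUSH/BRPOP (or
# pub/sub) plays the same role that queue.Queue plays here.
tasks = queue.Queue()
results = []

def worker():
    while True:
        task = tasks.get()   # blocks until a task arrives, like BRPOP
        if task is None:     # sentinel: shut the worker down
            break
        name, args = task
        if name == "add":
            results.append(args[0] + args[1])
        tasks.task_done()

t = threading.Thread(target=worker)
t.start()

# The "web application" enqueues work and returns immediately.
tasks.put(("add", (2, 2)))
tasks.put(("add", (3, 3)))
tasks.put(None)
t.join()

print(results)  # [4, 6]
```

The application never waits on the slow work itself; it only pays the cost of an enqueue, which is the whole point of moving the work behind a queue.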

In addition to Redis, you can use another excellent tool: Celery. Celery is a scheduling tool for asynchronous tasks. It is written in Python, but because it communicates over a standard protocol it can also be called from Ruby, PHP, JavaScript, and other languages. Besides executing tasks in the background through a message queue, Celery can also run scheduled tasks that fire at a given time or interval. The following shows how to use Celery to meet both of these requirements.

Celery Broker and Backend

When I first learned Celery, a RabbitMQ appeared, then a Redis appeared, and I was confused. In fact, this is precisely Celery's design. Simply put, RabbitMQ is a powerful message queue written in Erlang, and it can play the role of the broker in Celery. So what is a broker?

A broker is message-transport middleware, which you can think of as a mailbox. Whenever the application invokes a Celery asynchronous task, it drops a message into the broker, and a Celery worker then picks up the message and executes the corresponding task. In that sense, the mailbox is a message queue. What, then, is the backend? Usually, once the program has sent a message its job is done; it may never know when the other side received it. For this, Celery implements a backend that stores task status and results. For the broker, the official recommendation is RabbitMQ or Redis; for the backend, a database works well. For simplicity, we will use Redis for both.

Getting started

Using Celery involves three parts: defining task functions, running the Celery service, and calling the tasks from a client application.

Create a file tasks.py and enter the following code:

from celery import Celery

broker = 'redis://127.0.0.1:6379/5'
backend = 'redis://127.0.0.1:6379/6'

app = Celery('tasks', broker=broker, backend=backend)

@app.task
def add(x, y):
    return x + y

The code above imports celery and creates the Celery instance app. During instantiation, we specify the task module name, tasks (consistent with the file name), and pass in the broker and backend. It then defines a task function, add.

Next, start the Celery service. Run the following in a terminal:

celery -A tasks worker --loglevel=info

You will see some output, including the list of registered tasks.

How does a client program call the task? Open another terminal and enter the Python environment:

In [0]: from tasks import add
In [1]: r = add.delay(2, 2)
In [2]: add.delay(2, 2)
Out[2]: <AsyncResult: 6fdb0629-4beb-4eb7-be47-f22be1395e1d>
In [3]: r = add.delay(3, 3)
In [4]: r.re<TAB>
r.ready   r.result   r.revoke
In [4]: r.ready()
Out[4]: True
In [6]: r.result
Out[6]: 6
In [7]: r.get()
Out[7]: 6

On the Celery command line you can see the log of the executed task:

[2015-09-20 21:37:06,086: INFO/MainProcess] Task proj.tasks.add[76beb980-0f55-4629-a4fb-4a1776428ea8] succeeded in 0.00089102005586s: 6

Open the backend Redis database, and you can see the information Celery stored about the executed tasks.

The add function called in the Python environment above is actually executed by the worker in the Celery service. Note that delay() returns an AsyncResult immediately; it is waiting on that result, for example with r.get(), that blocks the application until the asynchronous task returns. Therefore, in practice, if you do not need the return value, simply fire the task and move on.
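Conceptually, an AsyncResult behaves much like a standard-library future: checking readiness is non-blocking, while fetching the result blocks. The following is only an analogy using concurrent.futures, not Celery itself; the `add` function here is a local stand-in for the worker-side task.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def add(x, y):
    time.sleep(0.1)  # stand-in for a slow task
    return x + y

# Submitting returns immediately with a handle, much like add.delay(2, 2)
# returns an AsyncResult without waiting for the worker.
with ThreadPoolExecutor() as pool:
    r = pool.submit(add, 2, 2)
    print(r.done())    # usually False right away, like r.ready()
    print(r.result())  # blocks until the task finishes, like r.get()
```

The blocking cost is paid only by the code that insists on the answer; code that fires and forgets never waits.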

Scheduled Tasks

The usage above is a simple configuration. Below is a more robust way to use Celery. First create a Python package for the celery service and name it proj. The directory layout is as follows:

➜  proj tree
.
├── __init__.py
├── celery.py   # creates the celery instance
├── config.py   # configuration file
└── tasks.py    # task functions

First, celery.py.

#!/usr/bin/env python
# -*- coding:utf-8 -*-
from __future__ import absolute_import
from celery import Celery

app = Celery('proj', include=['proj.tasks'])
app.config_from_object('proj.config')

if __name__ == '__main__':
    app.start()

This time the broker and backend are not specified directly when the app is created; instead, they live in the configuration file.

config.py

#!/usr/bin/env python
# -*- coding:utf-8 -*-
from __future__ import absolute_import

CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/5'
BROKER_URL = 'redis://127.0.0.1:6379/6'

The rest is tasks.py.

#!/usr/bin/env python
# -*- coding:utf-8 -*-
from __future__ import absolute_import
from proj.celery import app

@app.task
def add(x, y):
    return x + y

Usage is just as simple. In the directory at the same level as proj, start celery:

celery -A proj worker -l info

Calling a task is also very simple: just call the proj.tasks functions in client code.

Scheduler

A common requirement is to execute a task at a fixed interval. It is configured as follows.

config.py

#!/usr/bin/env python
# -*- coding:utf-8 -*-
from __future__ import absolute_import
from datetime import timedelta

CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/5'
BROKER_URL = 'redis://127.0.0.1:6379/6'
CELERY_TIMEZONE = 'Asia/Shanghai'

CELERYBEAT_SCHEDULE = {
    'add-every-30-seconds': {
        'task': 'proj.tasks.add',
        'schedule': timedelta(seconds=30),
        'args': (16, 16)
    },
}

Note that the configuration file needs to specify a time zone. This configuration runs the add function every 30 seconds.
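What celery beat does with such a schedule entry can be sketched as a toy loop. Here `run_periodically` is a made-up helper for illustration, and the interval is shrunk to 0.01 seconds so the sketch finishes quickly:

```python
import time
from datetime import timedelta

def run_periodically(task, args, schedule, iterations):
    """Toy version of what celery beat does: fire `task` every `schedule`.

    Real beat enqueues a message for a worker instead of calling directly,
    and it loops forever rather than for a fixed number of iterations.
    """
    results = []
    for _ in range(iterations):
        results.append(task(*args))
        time.sleep(schedule.total_seconds())
    return results

def add(x, y):
    return x + y

# The CELERYBEAT_SCHEDULE entry above, with the 30 s interval shrunk.
out = run_periodically(add, (16, 16), timedelta(seconds=0.01), iterations=3)
print(out)  # [32, 32, 32]
```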

Once the scheduler is used, celery needs to be started with the -B parameter added:

celery -A proj worker -B -l info
Crontab

Periodic tasks can also be scheduled crontab-style; Celery has a crontab mode too. Modify config.py:

#!/usr/bin/env python
# -*- coding:utf-8 -*-
from __future__ import absolute_import
from celery.schedules import crontab

CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/5'
BROKER_URL = 'redis://127.0.0.1:6379/6'
CELERY_TIMEZONE = 'Asia/Shanghai'

CELERYBEAT_SCHEDULE = {
    # Executes every Monday morning at 7:30 a.m.
    'add-every-monday-morning': {
        'task': 'proj.tasks.add',
        'schedule': crontab(hour=7, minute=30, day_of_week=1),
        'args': (16, 16),
    },
}

All in all, the timedelta scheduler is finer-grained and can be accurate to the second, while the crontab mode works like the familiar cron syntax. Celery also supports more advanced usage, such as running workers on multiple machines and enabling multiple concurrent workers.
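To see how a crontab entry like the one above matches times, here is a small sketch of the field-matching logic. `crontab_matches` is a made-up helper, and it assumes celery's day_of_week convention of 0 = Sunday, 1 = Monday:

```python
from datetime import datetime

def crontab_matches(dt, minute, hour, day_of_week):
    """Check whether `dt` matches a crontab(minute=..., hour=...,
    day_of_week=...) entry with single-value fields.

    Assumes celery's convention: day_of_week 0 is Sunday, 1 is Monday.
    """
    # datetime.weekday(): Monday == 0 ... Sunday == 6; shift to 0 == Sunday.
    dow = (dt.weekday() + 1) % 7
    return dt.minute == minute and dt.hour == hour and dow == day_of_week

# 2015-09-21 was a Monday, so 07:30 that day matches
# crontab(hour=7, minute=30, day_of_week=1).
monday = datetime(2015, 9, 21, 7, 30)
print(crontab_matches(monday, minute=30, hour=7, day_of_week=1))  # True
```

Real crontab fields also accept ranges, lists, and `*`; this sketch covers only the single-value case used in the configuration above.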




Text: Human World (Jianshu author)
Original link: http://www.jianshu.com/p/1840035cb510
Copyright belongs to the author. Please contact the author for authorization and credit the Jianshu author.
