Basic Primer: Python Modules and Packages. In-Depth Celery: Common Architecture / Solution Selection / Must-Knows

Source: Internet
Author: User
Tags: connection pooling, rabbitmq

Brief Introduction:

Description: Celery is an asynchronous task queue built on distributed messaging. A so-called task is simply a message, and the payload of that message contains all the data needed to perform the task.


Several major features:

1. Celery is easy to use and maintain: it requires no configuration files, and the default configuration is written to the message broker automatically when it starts.

2. Celery is highly available: the client or consumer automatically retries when a connection is lost or fails, and availability can be improved further through the message broker's Master/Master high-availability mode.

3. Celery is fast: a single process can handle around a million tasks per minute, and with optimization round-trip latency can be kept at the sub-millisecond level.

4. Celery is flexible: almost every part can be extended or used standalone, including connection pooling, serialization, compression, logging, schedulers, consumers, producers, autoscaling, and broker transports.

5. Celery's broker support is first-class for RabbitMQ and Redis (other brokers are experimental). Result storage fully supports AMQP, Redis, Memcached, MongoDB, SQLAlchemy, Django ORM, and Apache Cassandra; serialization fully supports pickle, json, yaml, and msgpack (with zlib and bzip2 compression); and the concurrency model fully supports prefork, eventlet, and gevent.
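To give a concrete sense of why the serializer choice matters, here is a quick stdlib-only comparison of pickle and JSON encodings of a typical task message (msgpack, recommended later in this article, is generally more compact still). The payload fields shown are illustrative, not Celery's exact wire format:

```python
import json
import pickle

# A typical task payload: the message must carry everything
# the worker needs to execute the task.
payload = {"task": "app.add", "args": [4, 4], "kwargs": {}, "id": "abc-123"}

json_bytes = json.dumps(payload).encode("utf-8")   # portable across languages
pickle_bytes = pickle.dumps(payload)               # Python-only, arbitrary objects

# Both round-trip the payload; they differ in size, speed, and portability.
print(len(json_bytes), len(pickle_bytes))
```
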


Common Architecture:

(Figure: common Celery architecture, showing task producers, the message broker, task consumers, and the result store.)

Description: the task producer pushes the tasks to be performed into the message broker's queue through the Celery API; task consumers then pull tasks from the queue and execute them as their own capacity allows, optionally saving results to the result store. The task producer and task consumer run separately, which is what makes the whole flow asynchronous. Note, however, that the message broker is not itself a Celery component; the project currently fully supports RabbitMQ and Redis as brokers, and of the two the author strongly recommends RabbitMQ.

Addendum: the producer can be either a Celery client or any publisher, because Celery is an asynchronous task queue built on distributed messaging. Applications can therefore be deployed flexibly on a single host (invoking task units via import) or across multiple hosts (via app.send_task(name, args)).
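The flow described above (a producer puts task messages on a broker queue while a separately running consumer pulls, executes, and stores results) can be sketched with the standard library alone. This is an illustration of the pattern only, not Celery itself; the in-memory queue and dict stand in for the broker and result store:

```python
import queue
import threading

broker = queue.Queue()   # stands in for RabbitMQ/Redis
results = {}             # stands in for the result store

def worker():
    # Consumer: pull task messages and execute them until told to stop.
    while True:
        msg = broker.get()
        if msg is None:
            break
        task_id, func, args = msg
        results[task_id] = func(*args)

def add(x, y):
    return x + y

t = threading.Thread(target=worker)
t.start()

# Producer: publish a task message; execution happens asynchronously
# in the worker thread, not in the caller.
broker.put(("task-1", add, (4, 4)))
broker.put(None)  # shutdown signal
t.join()
print(results["task-1"])  # → 8
```

In real Celery the queue lives in an external broker process, so producer and consumer can run on different hosts.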


Solution Selection:

Description: for the best performance, it is highly recommended to choose RabbitMQ as the message broker, the C library librabbitmq as the Python client interface, msgpack for message serialization between client and consumer, and Redis for result storage.



Quick Installation:

Py2.6.x / Py2.7.x:

pip install --upgrade "kombu==3.0.37" "celery[librabbitmq,redis,msgpack]==3.1.25"

Note: Celery releases after 3.1.25 no longer support the older Python 2 interpreters, so it is strongly recommended to use 3.1.25, the long-term stable, cross-platform support release, which still supports Py2.6.x and Py2.7.x.


Get Started Quickly:

#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Date:    2016-12-24 16:27:01
# @Author:  li full ([email protected])
# @Link:    http://xmdevops.blog.51cto.com/
# @Version: $Id$
from __future__ import absolute_import
# Description: import public modules
from celery import Celery
# Description: import other modules

app = Celery(
    __name__,
    broker='amqp://root:[email protected]:5672//',
    backend='redis://10.2.5.51:5123/0',
    include=[],
)

@app.task
def add(x, y):
    return x + y

Description: a Celery application must be an importable Python module, so saving the code above as app.py is enough. Briefly: the first argument to the Celery class is the name of the current module and is required; broker specifies the message broker; backend specifies the result store; and include lists the relative import locations of additional task files. Then run celery worker -A app --loglevel=info in a terminal to start the consumer, and in a Python shell in another terminal run from app import add; add.delay(4, 4). Calling the task asynchronously returns an AsyncResult instance: result.ready() checks whether the task has finished, and result.get(timeout=1, propagate=False) waits for completion and fetches the return value. You can watch the change in both the worker terminal and the result store; other, more advanced options can be found via celery --help.
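The delay()/ready()/get() workflow has the same shape as a standard-library Future, which may help if Celery is new. The following is an analogy using concurrent.futures, not Celery's actual implementation; the correspondence to AsyncResult is noted in the comments:

```python
from concurrent.futures import ThreadPoolExecutor

def add(x, y):
    return x + y

with ThreadPoolExecutor() as pool:
    # pool.submit is analogous to add.delay(4, 4): it returns immediately
    # with a handle while the work runs elsewhere.
    future = pool.submit(add, 4, 4)
    value = future.result(timeout=1)  # analogous to result.get(timeout=1)
    print(future.done())              # analogous to result.ready() → True
    print(value)                      # → 8
```

The key idea in both APIs is that the caller gets back a handle right away and decides later whether to poll, block with a timeout, or ignore the result.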


You must know:

Description: when you run celery worker -A <app> --loglevel=info on the command line, <app> must be importable, so it can be a Python module or a package, given as an absolute import name (e.g. app or work.app). Note that for both, the Celery entry file must be specified correctly: for a package, the default entry file name is celery.py, from which Celery obtains the instantiated app via dynamic import; it then imports the task execution units from the task files named in the configuration and the include list given at instantiation, and waits for tasks. In other words, Celery locates task-unit definitions through relative/absolute imports. Since importing a .py file generates a .pyc file, be sure to delete stale .pyc files after changing code.
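The dynamic-import step described above (resolving the -A name to a module and picking up the instantiated app attribute) can be mirrored with importlib. The module name fake_app below is hypothetical and is built in memory only so the sketch is self-contained; a real project would have app.py or package/celery.py on disk:

```python
import importlib
import sys
import types

# Build a stand-in "app.py" module in memory so this example is
# self-contained; the name "fake_app" is hypothetical.
mod = types.ModuleType("fake_app")
mod.app = object()  # stands in for the instantiated Celery() app
sys.modules["fake_app"] = mod

# In essence, `celery worker -A fake_app` imports the module by name
# at runtime and then looks up its `app` attribute.
imported = importlib.import_module("fake_app")
print(imported.app is mod.app)  # → True
```

This is also why the entry file name matters for packages: the import machinery can only find what the conventional name (celery.py) or the given module path points at.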





This article is from the "li-yun development road" blog, Please be sure to keep this source http://xmdevops.blog.51cto.com/11144840/1885857

