Celery Introduction:
Celery is an asynchronous task queue/job queue based on distributed message passing. It focuses on real-time operation but also supports scheduling. Celery is written in Python, but the protocol can be implemented in any language. For more details, refer to the official documentation.
This article shows how to run asynchronous tasks in Python with Celery: install Celery on CentOS 6.4, use Supervisor to manage the Celery worker process, and use Redis as the message broker. In practice, Celery can be used for asynchronous jobs such as sending emails, sending messages, requesting URLs, and so on.
Environment deployment
Environment
System: CentOS 6.4 64-bit
Python version: 2.6.6
Create a related directory
mkdir -p /data/{redis,logs,www}
(The Python scripts will live in /data/www.)
Install celery
yum install python-devel python-setuptools
easy_install pip
pip install celery
pip install redis   # (the Redis client module for Python)
Install Supervisor
pip install supervisor
echo_supervisord_conf > /etc/supervisord.conf
If running echo_supervisord_conf > /etc/supervisord.conf raises pkg_resources.DistributionNotFound: meld3>=0.6.5, locate supervisor-3.1.3-py2.6.egg-info/requires.txt, comment out the meld3 >= 0.6.5 line in that file, and then run echo_supervisord_conf > /etc/supervisord.conf again.
To locate the file:
find / | grep requires.txt
Configuration
vim /etc/supervisord.conf
Append the following section to the end of the configuration file:
[program:celery]
command=/usr/bin/celery worker -A tasks
directory=/data/www
stdout_logfile=/data/logs/celery.log
autostart=true
autorestart=true
redirect_stderr=true
stopsignal=QUIT
Comments:
; program:celery is the name of the managed process; choose any name you like, here it is celery
; command is the command that starts the Celery worker
; directory is the working directory; the worker has to be started from /data/www (where tasks.py lives), so Supervisor changes into this directory before running the command
; autorestart automatically restarts Celery if it exits
; stdout_logfile is the path where the Celery log is written
The configuration above means, roughly: change to the /data/www directory, run /usr/bin/celery worker -A tasks, and write the output to /data/logs/celery.log. The worker uses the default prefork pool unless another pool is specified, so it normally starts one worker process per CPU core. If anything goes wrong, check the log at /data/logs/celery.log.
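If a fixed number of prefork worker processes is preferred over the one-per-core default, the pool size can be pinned either with the worker's -c option or in the Celery configuration. Below is a minimal sketch using the old-style (Celery 3.x) setting name; the module name "myapp" and the pool size of 4 are placeholders, not part of this article's setup:
from celery import Celery

# hypothetical app just to illustrate the setting; in this article the real app lives in tasks.py
app = Celery("myapp", broker="redis://:PASSWORD@127.0.0.1:22222/10")
app.conf.update(CELERYD_CONCURRENCY=4)   # always start exactly 4 prefork worker processes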
Install Redis
cd /usr/local/src
wget http://download.redis.io/releases/redis-3.0.5.tar.gz
tar xf redis-3.0.5.tar.gz
cd redis-3.0.5
make
make install
# (to install under a specific path, use make PREFIX=/usr/local/redis install)
cp utils/redis_init_script /etc/init.d/redis
chmod a+x /etc/init.d/redis
mkdir /etc/redis
cp redis.conf /etc/redis/
Make a few small changes to redis.conf (the leading number is the line number in the file):
daemonize yes        # run the process in the background
port 22222           # change the port instead of using the default 6379
bind 127.0.0.1       # bind to 127.0.0.1 so that only local connections are accepted
dir /data/redis      # change the path where Redis stores its files
396 requirepass DILSI3_DFLKA_F3I_LFKDKDF!IDF   # set the password
453 maxmemory 256mb  # limit Redis to 256 MB of memory; this is a small virtual machine, so the limit is kept low
Comments:
requirepass DILSI3_DFLKA_F3I_LFKDKDF!IDF sets the Redis password, which clients must supply when connecting to Redis and when shutting it down.
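Before wiring Celery up, it can be worth confirming from Python that the Redis port, password, and database are reachable. A minimal sketch using the redis module installed earlier (the values are the ones configured above):
import redis

r = redis.StrictRedis(host='127.0.0.1', port=22222,
                      password='DILSI3_DFLKA_F3I_LFKDKDF!IDF', db=10)
print r.ping()   # prints True if the port, password and database are all correct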
Modify /etc/init.d/redis (the leading number is the line number in the file):
6  REDISPORT=22222
7  EXEC=/usr/local/bin/redis-server
8  CLIEXEC=/usr/local/bin/redis-cli
9  PASSWD="DILSI3_DFLKA_F3I_LFKDKDF!IDF"   # added line
10 PIDFILE=/var/run/redis.pid
11 CONF="/etc/redis/redis.conf"
$CLIEXEC -p $REDISPORT -a $PASSWD shutdown
Comments:
Shutting Redis down requires the Redis password, otherwise the shutdown fails; if no password is configured, none is needed.
Start Redis
service redis start
View the Redis process:
ps -ef | grep redis
root     28631     1  0 04:30 ?        00:00:00 /usr/local/bin/redis-server 127.0.0.1:22222
root     28635  1542  0 04:31 pts/0    00:00:00 grep redis
Write the Celery task script
cd /data/www
vim tasks.py
#!/usr/bin/env python
#coding: utf-8
import time
# import the Celery modules
from celery import Celery
from celery import platforms
# supervisord runs as root by default, so C_FORCE_ROOT must be set to True for Celery to run as root
platforms.C_FORCE_ROOT = True
# Celery configuration: connect to Redis. DILSI3_DFLKA_F3I_LFKDKDF!IDF is the Redis password,
# 22222 is the port, and 10 is the database number (Redis has 16 databases, 0-15; this uses db 10, the 11th)
config = {}
config['CELERY_BROKER_URL'] = 'redis://:DILSI3_DFLKA_F3I_LFKDKDF!IDF@127.0.0.1:22222/10'
config['CELERY_RESULT_BACKEND'] = 'redis://:DILSI3_DFLKA_F3I_LFKDKDF!IDF@127.0.0.1:22222/10'
# task results are not needed, so set the following option to True
config['CELERY_IGNORE_RESULT'] = True
app = Celery("tasks", broker=config['CELERY_BROKER_URL'])
app.conf.update(config)

@app.task
def output(num):
    """Wait 2 seconds, then print the number passed in."""
    time.sleep(2)
    print num
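For reference, a task defined this way can still be called like an ordinary function, or queued through Celery. A short sketch of the difference, using the output task above (the queued forms only do real work once the worker started below is running):
from tasks import output

output(5)                     # plain call: runs in this process and blocks for about 2 seconds
output.delay(5)               # queued: returns an AsyncResult immediately; the worker prints 5
output.apply_async(args=[5])  # same as delay(), but accepts extra options such as countdown or eta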
Start Supervisord
/usr/bin/supervisord
Check that it started successfully:
[root@drfdai www]# ps -ef | grep supervisor
root     28656     1  0 04:52 ?        00:00:00 /usr/bin/python /usr/bin/supervisord
root     28692  1542  0 04:53 pts/0    00:00:00 grep supervisor
Start Celery:
supervisorctl start celery
Check whether Celery connected to Redis successfully:
tailf /data/logs/celery.log
A line such as [2015-11-16 05:05:27,938: WARNING/MainProcess] celery@drfdai ready. indicates that the connection succeeded.
Test
Open the live log in terminal 1:
[root@drfdai www]# tailf /data/logs/celery.log
Open another terminal (terminal 2) for the test:
[root@drfdai ~]# cd /data/www
[root@drfdai www]# python
Python 2.6.6 (r266:84292, June 23 2015, 15:22:56)
[GCC 4.4.7 20120313 (Red Hat 4.4.7-11)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> from tasks import output
>>> for i in xrange(20):
...     output.apply_async(args=[i])
...
<AsyncResult:38a14673-3ce6-4244-b262-d488aa637ff9>
<AsyncResult:3ff1b2c0-4b37-4c70-a543-d7c58111cb99>
<AsyncResult:355c232f-c419-43d5-bf8f-afa4d7084136>
<AsyncResult:c3089d05-d430-4fd4-bfba-112d6dd5b635>
<AsyncResult:842db1ad-5490-456d-82d8-e81e60db244b>
<AsyncResult:fe0dd54a-8bad-43e2-a8b8-b87be158cf16>
<AsyncResult:edd95275-a3b3-46a5-8e21-b88c7faedbe9>
<AsyncResult:c21b9c62-644d-4171-bb68-9a371af60283>
<AsyncResult:29d38896-9d4d-4b97-8e27-6a7209f06dc9>
<AsyncResult:7c3ba209-01fd-44d3-9645-f9c85d799eaf>
<AsyncResult:b08bbb7a-341b-436c-8ef4-be954b1fcbc5>
<AsyncResult:b03193a2-c6d6-49ff-ac27-b453593050c4>
<AsyncResult:1be6394c-09c3-4f48-9ed4-1114fd1af637>
<AsyncResult:b496cdd3-3032-41ad-939e-414198c9056a>
<AsyncResult:0dd365ea-383b-4c93-9ed0-406bcff6f41b>
<AsyncResult:ce60892e-d640-4589-8a8b-3a7f18e8c6d3>
<AsyncResult:df4fd3c4-d9e8-474b-b60f-c6e11283564a>
<AsyncResult:83e7d36c-fc42-4f3b-8941-df40fa13a517>
<AsyncResult:9b25f7f7-c3bc-4bb0-8a1f-27ceec59cf0c>
<AsyncResult:f0d7e201-032b-41a0-8e7a-c02681c31947>
Switch back to terminal 1 and watch a new log line appear every 2 seconds:
warnings.warn(CDeprecationWarning(W_PICKLE_DEPRECATED))
[2015-11-16 05:58:09,687: WARNING/MainProcess] celery@drfdai ready.
[2015-11-16 05:59:18,816: WARNING/Worker-1] 0
[2015-11-16 05:59:19,826: WARNING/Worker-1] 1
[2015-11-16 05:59:20,830: WARNING/Worker-1] 2
[2015-11-16 05:59:21,836: WARNING/Worker-1] 3
[2015-11-16 05:59:22,841: WARNING/Worker-1] 4
[2015-11-16 05:59:23,848: WARNING/Worker-1] 5
[2015-11-16 05:59:24,853: WARNING/Worker-1] 6
[2015-11-16 05:59:25,858: WARNING/Worker-1] 7
[2015-11-16 05:59:26,864: WARNING/Worker-1] 8
[2015-11-16 05:59:27,873: WARNING/Worker-1] 9
[2015-11-16 05:59:28,884: WARNING/Worker-1] 10
[2015-11-16 05:59:29,892: WARNING/Worker-1] 11
[2015-11-16 05:59:30,899: WARNING/Worker-1] 12
[2015-11-16 05:59:31,908: WARNING/Worker-1] 13
[2015-11-16 05:59:32,912: WARNING/Worker-1] 14
[2015-11-16 05:59:33,920: WARNING/Worker-1] 15
[2015-11-16 05:59:34,926: WARNING/Worker-1] 16
[2015-11-16 05:59:35,933: WARNING/Worker-1] 17
[2015-11-16 05:59:36,939: WARNING/Worker-1] 18
[2015-11-16 05:59:37,944: WARNING/Worker-1] 19
When the function is called from terminal 2, the caller is not blocked by time.sleep(2): each apply_async call returns immediately. Meanwhile terminal 1 prints one log line every 2 seconds, which is the output of the task running in the worker.
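Because CELERY_IGNORE_RESULT is True in tasks.py, the AsyncResult objects above carry no stored state. If task results were needed, that setting could be dropped and the result fetched from the Redis backend; a minimal sketch under that assumption:
from tasks import output

result = output.apply_async(args=[7])   # returns immediately
print result.get(timeout=10)            # blocks until the worker finishes; output() returns None, so this prints None
print result.status                     # 'SUCCESS' once the task has completed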
Useful Code Snippets:
vim common/tasks.py
# import the email-sending modules
from email.mime.text import MIMEText
import smtplib
import json
import datetime
from celery import Celery
from celery import platforms
import requests
# import the configuration file
from config import setting
# mail-sending configuration
SMTP = setting.SMTP
user = setting.user
passwd = setting.passwd
# ... omitted here (the Celery app instance used as "celery" below is set up in the omitted part) ...
@celery.task
def sendMail(mail):
    '''
    to_list: recipient addresses, separated by ",", str
    subject: mail subject, str
    context: mail body, str
    '''
    to_list = mail.get('to_list')
    subject = mail.get('subject')
    context = mail.get('context')
    msg = MIMEText(context, 'html', 'utf-8')
    msg['Subject'] = subject
    msg['From'] = "Linux system mail <%s>" % user
    to_list = list(to_list.split(','))
    msg['To'] = ','.join(to_list)
    try:
        s = smtplib.SMTP()
        s.connect(SMTP)
        s.login(user, passwd)
        s.sendmail(user, to_list, msg.as_string())
        s.close()
        return json.dumps({"redid": 0, "redmsg": "send successful"})
    except Exception, e:
        print 'Error: %s' % e
        return json.dumps({"redid": -1, "redmsg": "send failed"})
Call this task in the actual project to send mail asynchronously; there is no need to wait for the result after sending:
#!/usr/bin/env python
#coding: utf-8
from common.tasks import sendMail
sendMail.delay(dict(to_list='xxxxxxxx@qq.com', subject='mail send test', context='...'))