A Preliminary Study of Python Celery

Recently I moved to a new work environment, and of course a new company means new practices: I needed to study Celery, Python's framework for parallel and distributed task processing. Needless to say, let's get to it.
I took some time to look at it, and sure enough the interface is simple and development is easy: it took about five minutes to write an asynchronous mail-sending service.
Message Broker
Celery itself does not include a messaging service; it uses a third-party message broker to deliver tasks. Celery currently supports RabbitMQ, Redis, and even databases as brokers, and Redis is arguably the best choice.
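As a quick illustration (a sketch, not from the original post), the broker is specified by URL when the app is created; the two URL formats below are Celery's standard broker URL schemes:

from celery import Celery

# Redis as the broker (database 0 on the local instance)
app = Celery('tasks', broker='redis://localhost:6379/0')

# or RabbitMQ over AMQP (default guest account on localhost)
# app = Celery('tasks', broker='amqp://guest:guest@localhost:5672//')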
Task Execution Unit
A worker is the task-execution unit provided by Celery; workers run concurrently on the nodes of a distributed system.
Task Result Store
The task result store is used to save the results of tasks executed by workers. Celery supports several backends for storing task results, including AMQP, Redis, Memcached, MongoDB, SQLAlchemy, Django ORM, Apache Cassandra, and IronCache.
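For example (a minimal sketch, not from the original post; Redis database 1 is an arbitrary choice), a result backend can be passed alongside the broker when creating the app:

from celery import Celery

# keep task results in Redis so callers can fetch them later
app = Celery('tasks',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/1')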
In addition, Celery supports several concurrency models and serialization formats:
- Concurrency: prefork, eventlet, gevent, threads/single-threaded (see the example after this list)
- Serialization: pickle, json, yaml, msgpack; zlib and bzip2 compression; cryptographic message signing; and so on
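The concurrency pool can be chosen on the worker command line. For example (assuming the optional gevent package is installed), a gevent pool with 100 greenlets:

celery -A tasks worker --pool=gevent --concurrency=100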
Without further ado, here is my first test code:
import time
from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')

@app.task
def sendmail(mail):
    print('sending mail to %s...' % mail['to'])
    time.sleep(2.0)
    print('mail sent.')
Here I connect directly to Redis.
Then start a Celery worker to process tasks:
celery -A tasks worker --loglevel=info
The command line above actually starts the worker; if you want to run it in the background, you can hand it off to Supervisor.
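For instance (a sketch, not from the original post; the program name and project path are placeholders), a minimal supervisord entry could look like this:

[program:celery-worker]
command=celery -A tasks worker --loglevel=info
directory=/path/to/project   ; the directory containing tasks.py
autostart=true
autorestart=true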
How do I send a task? Very simple:
[Screenshot: sending a task from a Python shell - https://s5.51cto.com/wyfs02/M01/8F/7D/wKioL1jhUDPj95uTAAR1BuJDw7E298.png]
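The screenshot itself is not reproduced here, but with the task defined above, sending it from a Python shell via Celery's standard delay() call presumably looked something like this (the recipient address is an example value):

from tasks import sendmail

# queue the task on the broker and return immediately
sendmail.delay(dict(to='celery@python.org'))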
As you can see, Celery's API design is really simple.
Then, in the worker's console, you can see the task being received and processed:
[Screenshot: worker log output showing the task being processed - https://s5.51cto.com/wyfs02/M00/8F/7F/wKiom1jhUH2AT-NgAAS4nmOeEXs818.png]
Here we can see that each task has a unique ID, and that tasks are executed asynchronously on the worker.
Celery's default settings meet basic requirements. The worker starts in prefork pool mode, with the pool size defaulting to the number of CPU cores; the default serialization mechanism is pickle, but it can be set to JSON instead. And because calling Unix/Linux programs from Python is so easy, Celery is a very good fit as an asynchronous task framework.
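For example (a sketch based on Celery's configuration API; the lowercase setting names below are the style used in Celery 4+), switching the serializer to JSON can be done on the app's config:

app.conf.update(
    task_serializer='json',    # serialize task messages as JSON
    accept_content=['json'],   # reject other payloads (e.g. pickle)
    result_serializer='json',  # serialize results as JSON too
)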
Celery also has more advanced features, such as combining multiple tasks into a single composite job, and a complete monitoring interface; those are left for further study.
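One of those combination primitives is the group. As a minimal sketch (not from the original post, reusing the sendmail task above with example addresses), two mails can be dispatched together as one composite job:

from celery import group

job = group(
    sendmail.s(dict(to='alice@example.com')),  # .s() builds a task signature
    sendmail.s(dict(to='bob@example.com')),
)
job.apply_async()  # submit both tasks to the broker at once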
This article is from the "Microsoft" blog; please keep this source when reposting: http://1238306.blog.51cto.com/1228306/1912637