The next step in studying Celery is to build a small project.
First, create a Python package:
mkdir proj
cd proj
touch __init__.py
Create celery.py in the proj directory:
from __future__ import absolute_import

from celery import Celery

app = Celery('proj',
             broker='amqp://',
             backend='amqp://',
             include=['proj.tasks'])

# Optional configuration, see the application user guide.
app.conf.update(
    CELERY_TASK_RESULT_EXPIRES=3600,
    CELERY_TASK_SERIALIZER='json',
    CELERY_ACCEPT_CONTENT=['json'],
    CELERY_RESULT_SERIALIZER='json',
)

if __name__ == '__main__':
    app.start()
Explanation:
app = Celery('proj') names the app 'proj'; see the Main Name section of the Celery user guide for details.
broker='amqp://' specifies the broker, in this case RabbitMQ. Because RabbitMQ's default user is guest (with password guest), you can also write it as amqp://guest@localhost//.
backend='amqp://' specifies the result backend. You must set a backend if you want to check what a task returned after the worker has executed it.
app.conf.update(...) modifies the configuration from within the program; the best practice is to put the configuration in a separate file (see the sketch after this list). The value changed here, CELERY_TASK_RESULT_EXPIRES, is how long results are kept when AMQP is used as the backend, in seconds.
include=['proj.tasks'] specifies the modules to import when the worker starts.
Note: by default, Celery uses pickle as the serializer. Since I am testing as the root user, Celery raises a security warning and refuses to run, so I configure CELERY_TASK_SERIALIZER, CELERY_ACCEPT_CONTENT and CELERY_RESULT_SERIALIZER to use JSON. (See also my other blog post: http://my.oschina.net/hochikong/blog/393270)
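As mentioned above, the cleaner approach is to keep the configuration in a separate module and load it with app.config_from_object(). A minimal sketch, assuming a file proj/celeryconfig.py (the file name and values here are my own illustration, not part of the original project):

# proj/celeryconfig.py -- illustrative standalone configuration module
CELERY_TASK_RESULT_EXPIRES = 3600      # keep results for one hour
CELERY_TASK_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_RESULT_SERIALIZER = 'json'

Then in celery.py the app.conf.update(...) call can be replaced with:

app.config_from_object('proj.celeryconfig')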
In the same directory, create tasks.py:
from __future__ import absolute_import

from proj.celery import app


@app.task
def add(x, y):
    return x + y


@app.task
def mul(x, y):
    return x * y


@app.task
def xsum(numbers):
    return sum(numbers)
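A side note of my own: the @app.task decorator keeps the functions directly callable, so you can run them in the current process (no broker or worker involved) to sanity-check the logic. A small sketch assuming the tasks.py above:

from proj.tasks import add, mul, xsum

# Called directly, these run locally and return plain values:
print(add(2, 3))        # 5
print(mul(4, 5))        # 20
print(xsum([1, 2, 3]))  # 6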
Start the worker:
celery -A proj worker -l info
The startup output includes the following warning:
RuntimeWarning: You are running the worker with superuser privileges, which is absolutely not recommended!
Please refer to the Celery documentation for ways to resolve this.
Now deliver a task to the worker from the local machine (open a new terminal):
root@...:~/celeryapp# ls
cagent.py  cagent.pyc  config.py  config.pyc  proj  test.py
You can see the proj directory.
Enter the Python interpreter:
root@...:~/celeryapp# python
Python 2.7.6 (default, Mar 22 2014, 22:59:56)
[GCC 4.8.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> from proj.agent import add   # my tasks.py was renamed to agent.py
>>> res = add.delay(2, 5)
>>>
Note the import form: because the task functions live in tasks.py (mine is renamed to agent.py), the import should follow module.path:attribute (see the Celery documentation's explanation of importing).
We call delay(2, 5) to execute the add function with arguments 2 and 5 (you can also use apply_async(), but then the arguments must be passed as a tuple; a short sketch follows below). The call returns an AsyncResult object, res:
>>> res
<AsyncResult: c08c72ed-8566-4025-b7f5-6ea5a9137966>
Call the get() method to fetch the result:
>>> res.get()
7
Note that if the task takes a long time to execute, get() should be given a timeout, for example: get(timeout=1).
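To make the two points above concrete (apply_async() with the arguments packed in a tuple, and get() with a timeout), here is a small sketch assuming the same add task; the TimeoutError handling is my own addition:

from celery.exceptions import TimeoutError

from proj.agent import add   # or proj.tasks if you kept the original file name

res = add.apply_async((2, 5))    # same effect as add.delay(2, 5)
try:
    print(res.get(timeout=1))    # wait at most one second for the result
except TimeoutError:
    print('result was not ready within one second')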
To check the state of the task:
>>> res.state
u'SUCCESS'
It returns the Unicode string u'SUCCESS'.
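Besides res.state, the AsyncResult object offers a few convenience checks; a short sketch, assuming the res obtained above:

print(res.id)            # the task id, e.g. c08c72ed-...
print(res.ready())       # True once the worker has finished the task
print(res.successful())  # True if it finished without raising an exception
print(res.state)         # 'PENDING' while waiting, u'SUCCESS' when done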
Now look at the other terminal, the one where the worker was started:
[2015-04-04 14:41:29,772: INFO/MainProcess] Received task: proj.agent.add[60eea8f6-0b6a-4bb4-909f-60a377936dcc]
[2015-04-04 14:41:29,798: INFO/MainProcess] Task proj.agent.add[60eea8f6-0b6a-4bb4-909f-60...
[2015-04-04 15:05:38,847: INFO/MainProcess] Received task: proj.agent.add[c08c72ed-8566-4025-b7f5-6ea5a9137966]
[2015-04-04 15:05:38,867: INFO/MainProcess] Task proj.agent.add[c08c72ed-8566-4025-b7f5-6ea5a9137966] succeeded in 0.0136815000001s: 7
You can see the information for the task execution.
To stop the worker, press Ctrl+C.
#SORA #celery Practice 1