Contents
Preface
Instantiation of tasks
Task names
Binding of tasks
Retrying of tasks
Request context of tasks
Inheritance of tasks
Previous articles in this series
Distributed Task Queue Celery
Distributed Task Queue Celery -- a detailed workflow
Introduction to the Distributed Task Queue Celery -- application foundation
Following on from the previous article, this one takes a deeper look at Celery tasks. The sample code again builds on the code from the previous article.
Tasks are the cornerstone of Celery. The prototype class is celery.app.task:Task, and it provides two core functions: sending a task message to a queue, and declaring the concrete function that a Worker executes after receiving that message.
By decorating an ordinary function with the app.task decorator, you can easily create a task function.
from proj.celery import app

@app.task
def add(x, y):
    return x + y
It is important to note that a task function is essentially no longer an ordinary function, but a celery.app.task:Task instance object.
>>> from proj.task.tasks import add
>>> dir(add)
['AsyncResult', 'MaxRetriesExceededError', ..., 'apply_async', 'delay', u'name', 'on_bound', 'on_failure', 'on_retry', 'on_success', 'request', 'retry', 'subtask', ...]
A task function can therefore call instance attributes and methods, such as delay/apply_async, that belong to Task.
>>> add.apply_async
<bound method add.apply_async of <@task: proj.task.tasks.add of proj at 0x7fedb363a790>>
>>> add.delay
<bound method add.delay of <@task: proj.task.tasks.add of proj at 0x7fedb363a790>>
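The reason delay and apply_async are available as methods is that the decorator returns a class instance, not a function. The following is a minimal pure-Python sketch of that pattern, not Celery's actual implementation; the Task class and task decorator here are toy stand-ins:

```python
class Task:
    """Toy stand-in for celery.app.task:Task (illustration only)."""

    def __init__(self, fun):
        self.run = fun
        # Mimic Celery's default automatic naming.
        self.name = '.'.join([fun.__module__, fun.__name__])

    def __call__(self, *args, **kwargs):
        # Calling the "function" actually calls the instance.
        return self.run(*args, **kwargs)

    def delay(self, *args, **kwargs):
        # Real Celery serializes a message to the broker here;
        # this sketch just runs the function synchronously.
        return self.run(*args, **kwargs)


def task(fun):
    # The decorator replaces the function with a Task instance.
    return Task(fun)


@task
def add(x, y):
    return x + y
```

After decoration, `add` is a `Task` instance: `add(1, 2)` still works, but so do `add.delay(1, 2)` and `add.name`.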
This is the key to understanding the app.task decorator: "decoration" is actually the instantiation of the task, and the decorator arguments are the task's initialization parameters.
The official documentation (http://docs.celeryproject.org/en/latest/userguide/tasks.html#list-of-options) provides a complete list of the app.task decorator's arguments.
In addition, watch out for the "decorator order" pitfall: app.task should always be applied last (i.e. placed topmost) to guarantee that it works reliably.
@app.task
@decorator2
@decorator1
def add(x, y):
    return x + y
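Why the order matters can be shown without Celery: the topmost decorator determines the type of the final object. If an ordinary function-wrapping decorator is applied after the task decorator, the Task instance is hidden behind a plain function. A sketch using a toy task decorator rather than Celery's:

```python
import functools


class Task:
    """Toy stand-in for a Celery Task instance (illustration only)."""

    def __init__(self, fun):
        self.run = fun

    def __call__(self, *args, **kwargs):
        return self.run(*args, **kwargs)

    def delay(self, *args, **kwargs):
        return self.run(*args, **kwargs)


def task(fun):
    return Task(fun)


def logged(fun):
    # An ordinary decorator that wraps with a plain function.
    @functools.wraps(fun)
    def wrapper(*args, **kwargs):
        return fun(*args, **kwargs)
    return wrapper


# Correct: task applied last (topmost) -> result is a Task instance.
@task
@logged
def good(x, y):
    return x + y


# Wrong: logged applied last -> result is a plain function, no .delay.
@logged
@task
def bad(x, y):
    return x + y
```

Here `hasattr(good, 'delay')` is True while `hasattr(bad, 'delay')` is False: both still compute x + y when called, but only `good` keeps the task interface.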
Task names
Each task function has a unique name. The name is included in the task message, and the Worker uses it to find the task function to execute. By default, tasks use automatic naming, which takes the function's full dotted path as the task name.
>>> add.name
u'proj.task.tasks.add'
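The automatic name is simply the function's module path joined with the function name; a sketch of the rule (Celery's real name generation also handles corner cases such as `__main__` and relative imports, which this sketch ignores):

```python
def gen_task_name(fun):
    # Default naming rule: "<module path>.<function name>".
    return '.'.join([fun.__module__, fun.__name__])


def add(x, y):
    return x + y
```

For a function defined in proj/task/tasks.py, this yields 'proj.task.tasks.add'. It also shows why importing the same module under two different paths can produce two different task names.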
Of course, you can also set the task name explicitly via the decorator argument name.
@app.task(name='new_name')
def add(x, y):
    return x + y
>>> from proj.task.tasks import add
>>> add.name
'new_name'
However, to avoid naming conflicts, doing so is generally not advisable unless you know exactly what you are doing.

Binding of tasks
Since a task function is essentially a Task instance object, it can of course be bound to self.
# Enable binding:
@app.task(bind=True)
def add(self, x, y):
    print('self: ', self)
    return x + y
>>> add.delay(1, 2)
<AsyncResult: 1982dc85-694b-4ceb-849b-5f69e40b4fe9>
The bound self object is important: many of a Task's advanced features, such as task retrying and the request context, are invoked through it.

Retrying of tasks
Task retrying is implemented by Task.retry, which re-sends the task message to the same queue so that the task is restarted.
@app.task(bind=True, max_retries=3)
def send_twitter_status(self, oauth, tweet):
    try:
        twitter = Twitter(oauth)
        twitter.update_status(tweet)
    except (Twitter.FailWhaleError, Twitter.LoginError) as exc:
        raise self.retry(exc=exc)
max_retries specifies the maximum number of retries; exc specifies the exception information to be written to the log, which requires the result backend to be enabled.
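The retry flow can be sketched in plain Python: each retry re-runs the task and a counter guards max_retries, after which the failure becomes final. In real Celery the retry is asynchronous (the message is re-sent to the queue) and the limit raises MaxRetriesExceededError; this synchronous sketch only mirrors the counting logic:

```python
class MaxRetriesExceededError(Exception):
    """Stand-in for celery.exceptions.MaxRetriesExceededError."""


def run_with_retries(fun, args, max_retries=3):
    """Synchronous sketch of Task.retry's counting logic (not Celery's code)."""
    retries = 0
    while True:
        try:
            return fun(*args)
        except Exception as exc:
            if retries >= max_retries:
                # Celery raises MaxRetriesExceededError at this point.
                raise MaxRetriesExceededError(str(exc))
            retries += 1  # in Celery: the message is re-sent to the same queue


attempts = []


def flaky():
    # Fails twice, then succeeds -- succeeds within max_retries=3.
    attempts.append(1)
    if len(attempts) < 3:
        raise IOError('transient failure')
    return 'ok'
```

Running `run_with_retries(flaky, ())` executes the function three times in total (one initial attempt plus two retries) and returns 'ok'.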
If you want to retry only when specific exceptions are raised, you can use the Task's "automatic retry for known exceptions" feature.
# The task is retried only if FailWhaleError is raised, with at most 5 retries.
@app.task(autoretry_for=(FailWhaleError,), retry_kwargs={'max_retries': 5})
def refresh_timeline(user):
    return twitter.refresh_timeline(user)
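The autoretry_for behavior can be emulated with a decorator that catches only the listed exception types and re-raises anything else immediately. A hypothetical sketch (the names autoretry, FailWhaleError, and refresh_timeline are illustrative, not Celery internals):

```python
def autoretry(exc_types, max_retries=5):
    """Decorator sketch: retry only when one of exc_types is raised."""
    def decorator(fun):
        def wrapper(*args, **kwargs):
            for attempt in range(max_retries + 1):
                try:
                    return fun(*args, **kwargs)
                except exc_types:
                    # Listed exception: retry until the limit, then re-raise.
                    if attempt == max_retries:
                        raise
        return wrapper
    return decorator


class FailWhaleError(Exception):
    """Illustrative exception type, mirroring the article's example."""


calls = []


@autoretry((FailWhaleError,), max_retries=5)
def refresh_timeline(user):
    calls.append(user)
    if len(calls) < 2:
        raise FailWhaleError('rate limited')
    return 'timeline:%s' % user
```

The first call raises FailWhaleError and is retried; the second attempt succeeds. Any exception type not in the tuple would propagate on the first attempt.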
Request context of tasks
When Celery asks a Worker to execute a task function, it provides the request context, so that during execution the task function can access the task state and the information contained in the context.
@app.task(bind=True)
def dump_context(self, x, y):
    print('Executing task id {0.id}, args: {0.args!r} kwargs: {0.kwargs!r}'.format(
        self.request))
>>> from proj.task.tasks import dump_context
>>> dump_context.delay(1, 2)
<AsyncResult: 00bc9f96-98df-4bca-a4a3-4774c535a44c>
Capturing useful information from the request context helps us locate and investigate how a task was executed.
A complete list of context attributes can be found in the official documentation (http://docs.celeryproject.org/en/latest/userguide/tasks.html#task-request).

Inheritance of tasks
By default, the app.task decorator instantiates the native Task class. This satisfies most application scenarios, but not all of them, so Celery allows us to derive specialized base classes by subclassing Task. This feature is very useful in complex application scenarios.

Re-specifying the default base class for all tasks
from celery import Celery
from celery.schedules import crontab
from kombu import Exchange, Queue
import six

# LOG, _LI and _LE below are assumed to come from the project's
# logging and i18n helper modules.


def make_app(ctx):
    app = Celery('proj')
    app.config_from_object('proj.celeryconfig')

    default_exchange = Exchange('default', type='direct')
    web_exchange = Exchange('task', type='direct')
    app.conf.task_default_queue = 'default'
    app.conf.task_default_exchange = 'default'
    app.conf.task_default_routing_key = 'default'
    app.conf.task_queues = (
        Queue('default', default_exchange, routing_key='default'),
        Queue('high_queue', web_exchange, routing_key='high_task'),
        Queue('low_queue', web_exchange, routing_key='low_task'),
    )
    app.conf.timezone = 'Asia/Shanghai'
    app.conf.beat_schedule = {
        'periodic_task_add': {
            'task': 'proj.task.tasks.add',
            'schedule': crontab(minute='*/1'),
            'args': (2, 2),
        },
    }

    TaskBase = app.Task

    class ContextTask(TaskBase):
        abstract = True
        context = ctx

        def __call__(self, *args, **kwargs):
            """Will be executed when the ContextTask instance is called."""
            LOG.info(_LI("Invoked celery task starting: %(name)s[%(id)s]"),
                     {'name': self.name, 'id': self.request.id})
            return super(ContextTask, self).__call__(*args, **kwargs)

        # What to do after the task executes successfully
        def on_success(self, retval, task_id, args, kwargs):
            """Invoked after the task is successfully executed."""
            LOG.info(_LI("Task %(id)s success: [%(ret)s]."),
                     {'id': task_id, 'ret': retval})
            return super(ContextTask, self).on_success(retval, task_id,
                                                       args, kwargs)

        # What to do after a task execution fails
        def on_failure(self, exc, task_id, args, kwargs, einfo):
            """Invoked after the task failed to execute."""
            msg = _LE("Task [%(id)s] failed:\n"
                      "Args: %(args)s\n"
                      "Kwargs: %(kw)s\n"
                      "Detail: %(err)s") % {'id': task_id,
                                            'args': args,
                                            'kw': kwargs,
                                            'err': six.text_type(exc)}
            LOG.exception(msg)
            return super(ContextTask, self).on_failure(exc, task_id,
                                                       args, kwargs, einfo)

    # Re-assign the default base class
    app.Task = ContextTask
    return app
Inheriting base classes with specialized attributes for different types of tasks
import celery


class JSONTask(celery.Task):
    serializer = 'json'

    def on_failure(self, exc, task_id, args, kwargs, einfo):
        print('{0!r} failed: {1!r}'.format(task_id, exc))


class XMLTask(celery.Task):
    serializer = 'xml'

    def on_failure(self, exc, task_id, args, kwargs, einfo):
        print('{0!r} failed: {1!r}'.format(task_id, exc))


# Specify a different base class
@task(base=JSONTask)
def add_json(x, y):
    raise KeyError()


# Specify a different base class
@task(base=XMLTask)
def add_xml(x, y):
    raise KeyError()