Celery must be instantiated before it can be used; the instance is called an application, or app for short. Instances are thread-safe, and multiple Celery apps (with different configurations, components, and tasks) can coexist in a single process space.
Creating the simplest possible app:
>>> from celery import Celery
>>> app = Celery()
>>> app
<Celery __main__ at 0x7f6be52d0cd0>
The app above is a Celery instance created in the __main__ module.
Main Name
Celery does not send any source code when dispatching a task; it sends only the task's name. Each worker maintains a mapping from task names to task functions, called the task registry.
Each task must therefore have a unique name. You can see how a task's default name is generated:
>>> @app.task
... def add(x, y):
...     return x + y
...
>>> add.name
'__main__.add'
As you can see, a task's default name is the name of the module the app is running in, plus the name of the task function.
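The registry idea can be sketched in plain Python. This is only a toy illustration of the concept, not Celery's actual implementation: a decorator that stores each function in a dict under its generated "module.function" name, so a worker can later look the function up by name.

```python
# Toy sketch of a task registry (illustration only, not Celery's real code):
# the core idea is a dict mapping "module.function" names to callables.
tasks = {}

def task(fn):
    """Register fn under module_name.function_name and return it unchanged."""
    name = '{0}.{1}'.format(fn.__module__, fn.__name__)
    tasks[name] = fn
    return fn

@task
def add(x, y):
    return x + y

# A worker would look the function up by its registered name:
qualified = '{0}.{1}'.format(add.__module__, add.__name__)
print(qualified in tasks)      # True
print(tasks[qualified](2, 3))  # 5
```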
Now create an app instance in a Python file, tasks.py:
from celery import Celery

app = Celery()

@app.task
def add(x, y):
    return x + y

if __name__ == '__main__':
    print(add.name)
    app.worker_main()
Creating the Celery instance directly in a shell, or running the module directly or on the command line, all execute in the __main__ module:
$ python tasks.py
__main__.add
When the module is imported with import, the main name is the module's name instead:
>>> from tasks import add
>>> add.name
'tasks.add'
The main name of an instance can also be specified manually:
>>> from celery import Celery
>>> app = Celery('tasks')
>>> app.main
'tasks'
The name of the task can also be specified:
>>> @app.task(name='sum-of-two-numbers')
>>> def add(x, y):
...     return x + y

>>> add.name
'sum-of-two-numbers'
Configuration
There are several ways to add configurations to an app instance:
Pass the configuration when creating the app instance:
app = Celery('tasks',
             backend='redis://localhost:6379/0',
             broker='redis://localhost:6379/0')
Set attributes on the app.conf property:
app.conf.result_backend = 'redis://localhost:6379/0'
app.conf.broker_url = 'redis://localhost:6379/0'
Update several settings at once:

>>> app.conf.update(
...     enable_utc=True,
...     timezone='Asia/Shanghai',
... )
Use a configuration module: create celeryconfig.py in the current directory, or in a directory on the Python path, so that it can be imported:
result_backend = 'redis://localhost:6379/0'
broker_url = 'redis://localhost:6379/0'
And then:
app.config_from_object('celeryconfig')
You can check that the configuration module is free of syntax errors:
$ python -m celeryconfig
You can also create a configuration class:
class Config:
    enable_utc = True
    timezone = 'Europe/London'

app.config_from_object(Config)
Or read the configuration module name from an environment variable:
import os
from celery import Celery

#: Set default configuration module name
os.environ.setdefault('CELERY_CONFIG_MODULE', 'celeryconfig')

app = Celery()
app.config_from_envvar('CELERY_CONFIG_MODULE')
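The pattern above can be sketched without Celery: resolve a module name from an environment variable (falling back to a default), then import it. The names toyconfig and TOY_CONFIG_MODULE below are made up for illustration, and the config module is simulated in memory instead of on disk.

```python
import importlib
import os
import sys
import types

def config_from_envvar(varname, default_module):
    # Fall back to the default when the variable is unset, then import.
    name = os.environ.setdefault(varname, default_module)
    return importlib.import_module(name)

# Simulate a config module in memory instead of creating a file on disk.
mod = types.ModuleType('toyconfig')
mod.broker_url = 'redis://localhost:6379/0'
sys.modules['toyconfig'] = mod

conf = config_from_envvar('TOY_CONFIG_MODULE', 'toyconfig')
print(conf.broker_url)  # redis://localhost:6379/0
```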
To view the configuration:
>>> app.conf.humanize(with_defaults=False, censored=True)
>>> app.conf.table(with_defaults=False, censored=True)
humanize() returns a string, while table() returns a dictionary.
Set with_defaults=True to include the default configuration; censored=True filters out sensitive settings whose names contain API, TOKEN, KEY, SECRET, PASS, SIGNATURE, or DATABASE.
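The censoring behavior can be approximated in a few lines of plain Python. This is a sketch of the idea, not Celery's exact matching rules, and the setting values are hypothetical:

```python
# Hide any setting whose name contains one of these substrings
# (a sketch of what censored=True does, not Celery's exact logic).
SENSITIVE = ('API', 'TOKEN', 'KEY', 'SECRET', 'PASS', 'SIGNATURE', 'DATABASE')

def censor(conf):
    return {
        key: '********' if any(s in key.upper() for s in SENSITIVE) else value
        for key, value in conf.items()
    }

settings = {
    'broker_url': 'redis://localhost:6379/0',
    'aws_secret_access_key': 'abc123',  # hypothetical value
}
print(censor(settings)['aws_secret_access_key'])  # ********
print(censor(settings)['broker_url'])             # redis://localhost:6379/0
```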
Laziness
App instances are lazy: creating an instance only sets it as the current app, and most initialization is deferred until it is actually needed.
An instance is not finalized until the app.finalize() method is called or the app.tasks attribute is accessed.
Finalizing the instance copies tasks that can be shared between apps, evaluates any pending task decorators, and makes sure all tasks are bound to the current app.
The app.task decorator does not actually create the task until the task is used, or until the app is finalized:
>>> from celery import Celery
>>> app = Celery()
>>> @app.task
... def add(x, y):
...     return x + y
...
>>> add.__evaluated__()
False
>>> repr(add)
'<@task: __main__.add of __main__ at 0x7f6571694cd0>'
>>> add.__evaluated__()
True
Calling the task's __repr__ method forces the task to actually be created.
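This lazy behavior can be mimicked with a small proxy class. The sketch below is only an illustration of the mechanism, assuming a factory that builds the real callable on first use; it is not Celery's implementation:

```python
class LazyProxy(object):
    """Defer building the real object until it is first needed."""

    def __init__(self, factory):
        self._factory = factory
        self._obj = None

    def _evaluate(self):
        if self._obj is None:
            self._obj = self._factory()
        return self._obj

    def evaluated(self):
        return self._obj is not None

    def __call__(self, *args, **kwargs):
        return self._evaluate()(*args, **kwargs)

    def __repr__(self):
        # Like Celery's tasks, repr() forces evaluation.
        return repr(self._evaluate())

lazy_add = LazyProxy(lambda: (lambda x, y: x + y))
print(lazy_add.evaluated())  # False: nothing has been built yet
repr(lazy_add)               # forces evaluation
print(lazy_add.evaluated())  # True
print(lazy_add(2, 3))        # 5
```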
Breaking the chain
When code depends on the current app, it is best to pass the app around as a parameter; this is called the app chain. Better practice:
class Scheduler(object):
    def __init__(self, app):
        self.app = app
And not:
from celery import current_app

class Scheduler(object):
    def run(self):
        app = current_app
The convention commonly used in library code:
from celery.app import app_or_default

class Scheduler(object):
    def __init__(self, app=None):
        self.app = app_or_default(app)
During development, you can set the CELERY_TRACE_APP environment variable so that an exception is raised whenever the app chain breaks:

$ CELERY_TRACE_APP=1 celery worker -l info
Abstract Tasks
Tasks created with the app.task decorator inherit from the application's Task base class.
It can be customized:
from celery import Task

class DebugTask(Task):
    def __call__(self, *args, **kwargs):
        print('TASK STARTING: {0.name}[{0.request.id}]'.format(self))
        return super(DebugTask, self).__call__(*args, **kwargs)
Then specify it with the base parameter:
@app.task(base=DebugTask)
def add(x, y):
    return x + y
You can also change the app's default Task base class:
>>> app.Task = DebugTask
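The base-class mechanism can be sketched without Celery: the decorator wraps the function in an instance of the given base class, so a subclass can add behavior around every call. All names here are illustrative stand-ins, not Celery's classes:

```python
class Task(object):
    """Minimal stand-in for a task base class (illustration only)."""

    def __init__(self, fn):
        self.fn = fn
        self.name = fn.__name__

    def __call__(self, *args, **kwargs):
        return self.fn(*args, **kwargs)

class DebugTask(Task):
    def __call__(self, *args, **kwargs):
        print('TASK STARTING: {0}'.format(self.name))  # runs before every call
        return super(DebugTask, self).__call__(*args, **kwargs)

def task(base=Task):
    def decorator(fn):
        return base(fn)  # wrap the function in the chosen base class
    return decorator

@task(base=DebugTask)
def add(x, y):
    return x + y

print(add(2, 3))  # prints the debug line, then 5
```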