Django & Celery - Easy Async Task Processing


So, while developing a web application, there comes a time when we need to process some tasks in the background, perhaps asynchronously. For example, your users would upload photos and the app would post them to multiple social networks. We would definitely want to offload the uploading work to some background workers.


Django and Celery make background task processing a breeze. In this article, we shall see how we can set up Django and Celery to start processing our background tasks. We will use Redis to maintain our task queue.


How does it work?

We define some tasks in our application. These tasks are expected to run for a pretty long time.
We run the Celery workers. Celery knows how to find and load these tasks, and the workers keep waiting for jobs.
We add some jobs to the workers' queue from our web app. The workers now have something to work on, so they take jobs off the queue and start processing them.
We can query the status of the jobs from our web app to know what's happening.

The easy-to-use Python API makes it all really simple; you don't need any specialized knowledge of Redis. Conceptually, the whole flow looks like the sketch below.
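The following snippet is only an illustrative sketch, not this article's actual setup (the 'demo' app name and the long_job task are made up), but the Celery calls themselves are real:

from celery import Celery

# A throwaway Celery app wired to a local Redis broker and result backend.
app = Celery('demo',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/0')

@app.task
def long_job(n):
    # Pretend this takes a long time.
    return n * 2

# From the web app: enqueue a job, then check on it later.
result = long_job.delay(21)
print(result.status)  # e.g. 'PENDING', later 'SUCCESS'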
Setting up

Let's first install the Redis server:

sudo apt-get install redis-server

The version that comes from the official Ubuntu repo is quite old. You can install the latest version from a 3rd party PPA.
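Once the server is running, a quick sanity check is to ping it from Python. A minimal sketch, assuming the redis-py client is installed (pip install redis):

import redis

# Connect to the default local Redis instance and ping it.
conn = redis.StrictRedis(host='localhost', port=6379, db=0)
print(conn.ping())  # True if the server is reachable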

Install celery with Redis support:

pip install celery-with-redis

And then install the django-celery package:

pip install django-celery

Configuration

Add "djcelery" to your INSTALLED_APPS list:

INSTALLED_APPS = (
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.sites',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'app',
    'djcelery',  # Must be added to INSTALLED_APPS
    'south',
)

Modify your main app's settings.py file to add the Celery-specific settings:

import djcelery
djcelery.setup_loader()

BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
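As a small variation (my own suggestion, not part of the original setup), the broker URL can be read from an environment variable so development and production can point at different Redis instances:

import os

# Falls back to the local Redis instance when BROKER_URL is not set.
BROKER_URL = os.environ.get('BROKER_URL', 'redis://localhost:6379/0')
CELERY_RESULT_BACKEND = BROKER_URL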

Now, inside your main application directory (the directory in which settings.py is located), create a file named "celery.py" with these contents:

from __future__ import absolute_import

import os

from celery import Celery
from django.conf import settings

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')

app = Celery('project')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

The above code does a few things:

It creates our own Celery instance.
It asks the Celery instance to load the necessary configs from our project's settings file.
It makes the instance auto-discover tasks from our INSTALLED_APPS (a layout sketch follows below).
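For reference, autodiscover_tasks expects each app listed in INSTALLED_APPS to keep its tasks in a tasks.py module. Assuming this article's names, the layout would look roughly like this:

project/
    __init__.py
    settings.py
    celery.py
app/
    __init__.py
    tasks.py
manage.py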
Also, let's modify the "__init__.py" file in the same directory to make the Celery app available more easily:

from __future__ import absolute_import

from .celery import app as celery_app

This would allow us to use the same app instance for shared tasks across reusable Django apps, as the sketch below shows.
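For instance, a reusable app could then declare its tasks with Celery's shared_task decorator, which binds to whatever app instance the host project exposes. A minimal sketch (the add task is made up for illustration):

from celery import shared_task

@shared_task
def add(x, y):
    # Runs on the current Celery app, i.e. the one exported in __init__.py.
    return x + y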

Defining Tasks

Now let's create a tasks.py file in one of our INSTALLED_APPS and add these contents:

from project import celery_app
from time import sleep


@celery_app.task()
def UploadTask(message):

    # Update the state. The metadata is available in the task.info dictionary.
    # The metadata is useful to store information relevant to the task.
    # Here we are storing the upload progress in the meta.
    UploadTask.update_state(state='PROGRESS', meta={'progress': 0})
    sleep(30)
    UploadTask.update_state(state='PROGRESS', meta={'progress': 30})
    sleep(30)
    return message


def get_task_status(task_id):

    # If you have a task_id, this is how you query that task.
    task = UploadTask.AsyncResult(task_id)

    status = task.status
    progress = 0

    if status == u'SUCCESS':
        progress = 100
    elif status == u'FAILURE':
        progress = 0
    elif status == 'PROGRESS':
        progress = task.info['progress']

    return {'status': status, 'progress': progress}

Now we have defined our own Celery app and we have our tasks. It's time to launch the workers and start adding tasks.

Processing Tasks

Before we can start processing tasks, we have to launch the Celery daemon first. This is how we do it:

celery worker --app=project.celery:app --loglevel=INFO

Here, we are asking Celery to use the instance we defined and configured earlier. "project" is the main app, the package that contains our settings.py along with celery.py. "app" is the variable name that holds the Celery instance.

Now let's use the Django shell to add and query jobs:

$ python manage.py shell

[Snipped]

>>> from app.tasks import *

# Please notice the "delay" method, which is a handy shortcut to apply_async.
# It allows us to call the task with exactly the same parameters
# as the original function. If you need custom options, use apply_async.

>>> t = UploadTask.delay("Hello world!")

# t is now an AsyncResult object. t.id is the task ID for the task.
# You can directly use t to query the task, e.g. t.status.

>>> get_task_status(t.id)
{'status': u'PROGRESS', 'progress': 0}

(after a 30-sec delay)

>>> get_task_status(t.id)
{'status': u'PROGRESS', 'progress': 30}

(after waiting another 30 secs or so)

>>> get_task_status(t.id)
{'status': u'SUCCESS', 'progress': 100}
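For example, apply_async accepts scheduling options that delay cannot take; countdown is a standard Celery option that postpones execution by the given number of seconds. A small sketch using the task defined earlier:

# Enqueue the same task, but ask the worker to start it 10 seconds later.
result = UploadTask.apply_async(args=["Hello world!"], countdown=10)
print(result.id)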

As we can see, our task was processed by Celery, and we could easily query its status. We would generally use the metadata to store any task-related information.
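If a script (rather than a web request) needs to watch a task, a simple polling loop over the get_task_status helper defined earlier is enough. A minimal sketch; the 5-second interval is an arbitrary choice:

from time import sleep

def wait_for_task(task_id, interval=5):
    # Poll until the task either succeeds or fails.
    while True:
        info = get_task_status(task_id)
        if info['status'] in (u'SUCCESS', u'FAILURE'):
            return info
        sleep(interval)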
