How Huey, a Lightweight Distributed Task Queue for Python, Implements Asynchronous Tasks
In this article we share a lightweight Python task queue program, Huey, and show how it implements asynchronous tasks. Interested readers can take a look.
Huey is a lightweight task queue. Its feature set and broker support are not as powerful as Celery's, but it favors being light over being heavy, and its code is much easier to read.
Introduction to Huey (lighter and easier to use than Celery, MRQ, or RQ!): a lightweight alternative.
Written in Python
No dependencies outside the stdlib, except Redis (or roll your own backend)
Support for Django
Supports:
Multi-threaded task execution
Scheduled execution at a given time
Periodic execution, like a crontab
Retrying tasks that fail
Task result storage
Installation:
Huey can be installed very easily using pip:

    pip install huey

Huey has no dependencies outside the standard library, but currently the only fully-implemented queue backend it ships with requires Redis. To use the Redis backend, you'll need to install the Python client:

    pip install redis

Using git: if you want to run the very latest, feel free to pull down the repo from GitHub and install by hand:

    git clone https://github.com/coleifer/huey.git
    cd huey
    python setup.py install

You can run the tests using the test runner:

    python setup.py test
The following is a detailed introduction to the Huey API and its parameters.
The code is as follows:

    from huey import RedisHuey, crontab

    huey = RedisHuey('my-app', host='redis.myapp.com')

    @huey.task()
    def add_numbers(a, b):
        return a + b

    @huey.periodic_task(crontab(minute='0', hour='3'))
    def nightly_backup():
        sync_all_data()
When running Huey as a worker, there are several CLI parameters.
Commonly used are:
-l: path of the log file.
-w: the number of workers; a larger -w value increases the task-processing capacity.
-p / --periodic: start the Huey worker with a dedicated thread that finds crontab-style periodic tasks in tasks.py and enqueues them on schedule.
-n: do not enqueue periodic tasks on their crontab schedule; a periodic task then runs only when you trigger it yourself.
--threads: self-explanatory (the number of worker threads).
From the original documentation, the following lists the options available for the consumer as well as their default values:

-l, --logfile: path to the file used for logging. When a file is specified, by default Huey uses a rotating file handler (1 MB/chunk) with a maximum of 3 backups. You can attach your own handler (huey.logger) as well. The default loglevel is INFO.
-v, --verbose: verbose logging (equates to DEBUG level). If no logfile is specified and verbose is set, the consumer logs to the console. This is very useful for testing/debugging.
-q, --quiet: only log errors. The default loglevel for the consumer is INFO.
-w, --workers: number of worker threads. The default is 1 thread; for applications that have many I/O-bound tasks, increasing this number may lead to greater throughput.
-p, --periodic: indicates that this consumer process should start a thread dedicated to enqueueing "periodic" tasks (crontab-like functionality). This defaults to True, so it should not need to be specified in practice.
-n, --no-periodic: indicates that this consumer process should not enqueue periodic tasks.
-d, --delay: when using a "polling"-type queue backend, the amount of time to wait between polls of the backend. Default is 0.1 seconds.
-m, --max-delay: the maximum amount of time to wait between polls, if using weighted backoff. Default is 10 seconds.
-b, --backoff: the amount to back off by when polling for results. Must be greater than one. Default is 1.15.
-u, --utc: indicates the consumer should use UTC time for all tasks, crontabs and scheduling. Default is True, so in practice you should not need to specify this option.
--localtime: indicates the consumer should use localtime for all tasks, crontabs and scheduling. Default is False.

Example: running the consumer with 8 threads, a logfile for errors only, and a very short polling interval:

    huey_consumer.py my.app.huey -l /var/log/app.huey.log -w 8 -b 1.1 -m 1.0
Huey relies on Redis to store the queue's tasks, so we need to install redis-server and redis-py first. I won't cover their installation here; you can search for it.
First we create a connection instance for Huey:
The code is as follows:

    # config.py
    from huey import Huey
    from huey.backends.redis_backend import RedisBlockingQueue

    queue = RedisBlockingQueue('test-queue', host='localhost', port=6379)
    huey = Huey(queue)
Next come the tasks themselves, that is, the functions you want to put on the queue. As with Celery, RQ, and MRQ, they are defined in tasks.py.
The code is as follows:

    # tasks.py
    from config import huey  # import the huey we instantiated in config.py

    @huey.task()
    def count_beans(num):
        print '-- counted %s beans --' % num
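Conceptually, @huey.task() does not run count_beans when you call it; the decorator intercepts the call, serializes it into a message, and pushes it onto the queue for a worker to pick up later. Below is a minimal pure-Python sketch of that idea (Python 3, with an in-memory deque standing in for Redis; this illustrates the pattern only, not Huey's actual implementation):

```python
import uuid
from collections import deque

queue = deque()  # stands in for the Redis-backed queue

def task():
    """Decorator: calling the task enqueues a message instead of running it."""
    def decorator(func):
        def wrapper(*args, **kwargs):
            message = {'id': str(uuid.uuid4()), 'task': func.__name__,
                       'args': args, 'kwargs': kwargs}
            queue.append(message)   # producer side: just enqueue the call
            return message['id']    # hand back an id for later result lookup
        wrapper.run = func          # consumer side calls the real function
        return wrapper
    return decorator

@task()
def count_beans(num):
    return 'counted %s beans' % num

task_id = count_beans(100)   # this enqueues, it does not execute
msg = queue.popleft()        # a worker would pop the message...
result = count_beans.run(*msg['args'])  # ...and run the real function
print(result)  # counted 100 beans
```

The key point is the split: the decorated call is cheap and returns immediately, while the actual work happens wherever the message is consumed.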
Then there is the script that actually kicks off execution: main.py acts as the producer and tasks.py as the consumer. main.py is responsible for feeding in the data.
The code is as follows:

    # main.py
    from config import huey  # import our "huey" object
    from tasks import count_beans  # import our task

    if __name__ == '__main__':
        beans = raw_input('How many beans? ')
        count_beans(int(beans))
        print 'Enqueued job to count %s beans' % beans
Ensure you have Redis running locally
Ensure you have installed Huey
Start the consumer: huey_consumer.py main.huey (note that this is "main.huey", not "config.huey").
Run the main program: python main.py
As with Celery and RQ, to retrieve results you need to declare a result store in config.py or in your main code. For now Huey only supports Redis for this, but relative to its feature set and small footprint, that is enough!
It only takes a few lines: import RedisDataStore and declare the storage address.
The code is as follows:

    # config.py
    from huey import Huey
    from huey.backends.redis_backend import RedisBlockingQueue
    from huey.backends.redis_backend import RedisDataStore  # ADD THIS LINE

    queue = RedisBlockingQueue('test-queue', host='localhost', port=6379)
    result_store = RedisDataStore('results', host='localhost', port=6379)  # ADDED
    huey = Huey(queue, result_store=result_store)  # ADDED result store
Now if we try again in IPython, we will find that we can get the return value from tasks.py. Note that when you fetch it in main.py, it is still retrieved from Redis by the task's UUID.
The code is as follows:

    >>> from main import count_beans
    >>> res = count_beans(100)
    >>> res  # what is "res"?
    >>> res.get()  # get the result of this task
    'counted 100 beans'
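What res.get() does behind the scenes is conceptually simple: the worker writes the return value into the result store under the task's UUID, and get() reads (or, when blocking, polls) that key until a value appears. A rough in-memory stand-in follows; the class name DictResultStore and its methods are hypothetical, chosen only to mirror the shape of a Redis-backed result store:

```python
import time

class DictResultStore:
    """Toy result store: results keyed by task UUID, like a RedisDataStore."""
    def __init__(self):
        self._data = {}

    def put(self, task_id, value):
        self._data[task_id] = value

    def get(self, task_id, blocking=False, timeout=5.0, poll=0.01):
        """Non-blocking get returns None when no result is ready yet."""
        deadline = time.time() + timeout
        while True:
            if task_id in self._data:
                return self._data.pop(task_id)
            if not blocking or time.time() >= deadline:
                return None
            time.sleep(poll)  # poll until the worker stores the value

store = DictResultStore()
early = store.get('abc-123')                # nothing stored yet -> None
store.put('abc-123', 'counted 100 beans')   # worker finishes, stores result
late = store.get('abc-123', blocking=True)  # now the value comes back
print(early, late)
```

This also explains why res.get() returns None for a task that hasn't run yet, while res.get(blocking=True) waits.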
Huey also supports Celery-style deferred execution and crontab scheduling. These features matter: they let you schedule tasks on your own terms and stop depending on Linux's own crontab.
Usage is very simple: just add a delay time. Looking at Huey's source, tasks are executed immediately by default; of course, that also depends on whether your worker threads are busy.
The code is as follows:

    >>> import datetime
    >>> res = count_beans.schedule(args=(100,), delay=60)
    >>> res
    >>> res.get()  # this returns None, no data is ready
    >>> res.get()  # still no data...
    >>> res.get(blocking=True)  # ok, let's just block until it's ready
    'counted 100 beans'
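The delay= mechanics can be pictured as: compute an ETA of now + delay, park the task in a schedule ordered by ETA, and only hand it to a worker once that time has passed. Here is a toy scheduler using a heap (illustrative only; the function names are made up for the sketch, not Huey's API):

```python
import heapq
from datetime import datetime, timedelta

schedule = []  # min-heap of (eta, task_name, args)

def schedule_task(name, args=(), delay=0, now=None):
    """Compute eta = now + delay and park the task until then."""
    now = now or datetime.now()
    eta = now + timedelta(seconds=delay)
    heapq.heappush(schedule, (eta, name, args))
    return eta

def pop_due(now):
    """Pop every task whose ETA has passed, in ETA order."""
    due = []
    while schedule and schedule[0][0] <= now:
        due.append(heapq.heappop(schedule))
    return due

t0 = datetime(2015, 1, 1, 12, 0, 0)
schedule_task('count_beans', (5,), delay=10, now=t0)
schedule_task('count_beans', (100,), delay=60, now=t0)

due_early = pop_due(t0 + timedelta(seconds=30))   # only the delay=10 task is due
due_late = pop_due(t0 + timedelta(seconds=120))   # now the delay=60 task is due
print(due_early, due_late)
```

The consumer's polling loop in Huey plays the role of pop_due here, repeatedly asking "is anything due yet?" between sleeps.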
One more thing: retries. Huey also supports retrying, which is a practical feature. If you have read my earlier article on Celery's retry mechanism, you should be able to understand how Huey handles it too. Yes, it is implemented by the decorator placed in front of the task function: the decorator wraps the function in try/except retry logic. You get the idea.
The code is as follows:

    # tasks.py
    from datetime import datetime

    from config import huey

    @huey.task(retries=3, retry_delay=10)
    def try_thrice():
        print 'trying....%s' % datetime.now()
        raise Exception('nope')
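As described above, retries=3, retry_delay=10 boils down to the decorator wrapping the task in try/except logic: on failure it waits retry_delay seconds and re-runs the function, up to retries times. A standalone sketch of that mechanism (not Huey's real code; retry_delay is set to 0 here so the example runs instantly):

```python
import time

def task(retries=0, retry_delay=0):
    """Toy version of a retrying task decorator."""
    def decorator(func):
        def wrapper(*args, **kwargs):
            attempts_left = retries
            while True:
                try:
                    return func(*args, **kwargs)
                except Exception:
                    if attempts_left <= 0:
                        raise                # out of retries: give up
                    attempts_left -= 1
                    time.sleep(retry_delay)  # wait before the next attempt
        return wrapper
    return decorator

attempts = []

@task(retries=3, retry_delay=0)
def try_thrice():
    attempts.append(1)
    if len(attempts) < 3:
        raise Exception('nope')
    return 'succeeded on attempt %d' % len(attempts)

result = try_thrice()
print(result)  # succeeded on attempt 3
```

In real Huey the retry is rescheduled through the queue rather than an in-process loop, but the decision logic is the same.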
Huey also gives you a way to back out: if you have scheduled a delayed task and want to cancel it, that's easy, just call revoke() on it.
The code is as follows:

    # count some beans
    res = count_beans(10000000)
    res.revoke()

The same applies to tasks scheduled in the future:

    res = count_beans.schedule(args=(100000,), eta=in_the_future)
    res.revoke()

    @huey.task(crontab(minute='*'))
    def print_time():
        print datetime.now()
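One way to picture revoke() is a shared "revoked" set: revoking records the task's id, and the consumer checks that set before executing anything it pops off the queue. A minimal illustration (the names here are invented for the sketch, not Huey internals):

```python
revoked_ids = set()

class TaskHandle:
    """Stand-in for the result handle returned when a task is enqueued."""
    def __init__(self, task_id):
        self.task_id = task_id

    def revoke(self):
        revoked_ids.add(self.task_id)  # mark the task as cancelled

def consume(task_id, func, *args):
    """The worker side: skip any task whose id was revoked."""
    if task_id in revoked_ids:
        return None
    return func(*args)

count_beans = lambda n: 'counted %s beans' % n

res = TaskHandle('task-42')
res.revoke()                                    # cancel before the worker runs it
skipped = consume('task-42', count_beans, 100)  # -> None, task was revoked
ran = consume('task-43', count_beans, 100)      # -> 'counted 100 beans'
print(skipped, ran)
```

Because the check happens on the consumer side, revoking works even for tasks that are already sitting in the queue or scheduled for the future.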
task() - a transparent decorator that gracefully turns your function into a task.
periodic_task() - a decorator for recurring tasks.
crontab() - defines the crontab-style schedule that periodic tasks follow once the worker is started.
BaseQueue - the task queue base class.
BaseDataStore - where results are stored after a task executes; you can subclass BaseDataStore yourself.
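A crontab spec like minute='*/5' is just a pattern matched against the current time, field by field. Below is a simplified matcher for a single minute field, handling '*', plain numbers, and '*/N' steps (a sketch only; Huey's real crontab() supports more syntax and more fields):

```python
def minute_matches(spec, minute):
    """Return True if `minute` (0-59) matches a crontab minute spec."""
    for part in spec.split(','):
        if part == '*':
            return True                      # wildcard matches everything
        if part.startswith('*/'):
            if minute % int(part[2:]) == 0:  # step values: */5 -> 0, 5, 10, ...
                return True
        elif minute == int(part):            # a literal minute value
            return True
    return False

print(minute_matches('*/5', 10))   # True
print(minute_matches('*/5', 7))    # False
print(minute_matches('0,30', 30))  # True
```

The periodic-task thread in the consumer does the equivalent of this check once a minute to decide which periodic tasks to enqueue.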
The official Huey git repository provides the relevant example code:
main.py
The code is as follows:

    from config import huey
    from tasks import count_beans

    if __name__ == '__main__':
        beans = raw_input('How many beans? ')
        count_beans(int(beans))
        print('Enqueued job to count %s beans' % beans)
tasks.py
The code is as follows:

    import random
    import time

    from huey import crontab

    from config import huey

    @huey.task()
    def count_beans(num):
        print "Start..."
        print('-- counted %s beans --' % num)
        time.sleep(3)
        print "End..."
        return 'counted %s beans' % num

    @huey.periodic_task(crontab(minute='*/5'))
    def every_five_mins():
        print('Consumer prints this every 5 mins')

    @huey.task(retries=3, retry_delay=10)
    def try_thrice():
        if random.randint(1, 3) == 1:
            print('OK')
        else:
            print('About to fail, will retry in 10 seconds')
            raise Exception('Crap, something went wrong')

    @huey.task()
    def slow(n):
        time.sleep(n)
        print('slept %s' % n)
run.sh
The code is as follows:

    #!/bin/bash
    echo "HUEY CONSUMER"
    echo "-------------"
    echo "In another terminal, run 'python main.py'"
    echo "Stop the consumer using Ctrl+C"
    PYTHONPATH=.:$PYTHONPATH
    python ../../huey/bin/huey_consumer.py main.huey --threads=2
We can clone the Huey code base first. There is an examples directory where you can see that it also supports Django, but that's not the point here!
The code is as follows:

    [xiaorui@devops /tmp]$ git clone https://github.com/coleifer/huey.git
    Cloning into 'huey'...
    remote: Counting objects: 1423, done.
    remote: Compressing objects: 100% (9/9), done.
    remote: Total 1423 (delta 0), reused 0 (delta 0)
    Receiving objects: 100% (1423/1423), 2.24 MiB | 29.00 KiB/s, done.
    Resolving deltas: 100% (729/729), done.
    Checking connectivity... done.
    [xiaorui@devops /tmp]$ cd huey/examples/simple
    [xiaorui@devops simple (master)]$ ll
    total 40
    -rw-r--r--  1 xiaorui  wheel    79B  9  8 08:49 README
    -rw-r--r--  1 xiaorui  wheel     0B  9  8 08:49 __init__.py
    -rw-r--r--  1 xiaorui  wheel    56B  9  8 08:49 config.py
    -rwxr-xr-x  1 xiaorui  wheel   227B  9  8 08:49 cons.sh
    -rw-r--r--  1 xiaorui  wheel   205B  9  8 08:49 main.py
    -rw-r--r--  1 xiaorui  wheel   607B  9  8 08:49 tasks.py