Simple Python Queue with Redis

Posted on June 3, 2012
#python #redis #nosql

The following article shows how to use redis to build a simple multi-producer, multi-consumer queue with an interface similar to the Python standard library Queue. With this queue you can easily share data between multiple processes or offload time-consuming calculations to multiple worker processes.

To store the data we use the redis list data type. Redis lists store simple strings sorted by insertion order.

The following redis commands are used:

  • rpush Insert an element at the tail of the list
  • blpop Get an element from the head of the list, block if list is empty
  • lpop Get an element from the head of the list, return nothing if the list is empty
  • llen Return the length of the list
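
For illustration, here is a minimal sketch of the four commands issued directly through redis-py (the list name 'demo' is just a placeholder); note that blpop returns a (key, value) tuple, while lpop returns only the value:

import redis

r = redis.Redis(host='localhost', port=6379, db=0)

r.rpush('demo', 'a')                # append to the tail of the list
r.rpush('demo', 'b')
print(r.llen('demo'))               # 2 -- current length of the list
print(r.lpop('demo'))               # head element (bytes by default), or None if the list is empty
print(r.blpop('demo', timeout=1))   # (key, value) tuple, e.g. (b'demo', b'b'); blocks up to 1 second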

The implementation uses the
redis-py library to talk to the server.

import redis

class RedisQueue(object):
    """Simple Queue with Redis Backend"""

    def __init__(self, name, namespace='queue', **redis_kwargs):
        """The default connection parameters are: host='localhost', port=6379, db=0"""
        self.__db = redis.Redis(**redis_kwargs)
        self.key = '%s:%s' % (namespace, name)

    def qsize(self):
        """Return the approximate size of the queue."""
        return self.__db.llen(self.key)

    def empty(self):
        """Return True if the queue is empty, False otherwise."""
        return self.qsize() == 0

    def put(self, item):
        """Put item into the queue."""
        self.__db.rpush(self.key, item)

    def get(self, block=True, timeout=None):
        """Remove and return an item from the queue.

        If optional args block is true and timeout is None (the default), block
        if necessary until an item is available."""
        if block:
            item = self.__db.blpop(self.key, timeout=timeout)
            if item:
                # blpop returns a (key, value) tuple; keep only the value
                item = item[1]
        else:
            # lpop returns the value directly, or None if the list is empty
            item = self.__db.lpop(self.key)
        return item

    def get_nowait(self):
        """Equivalent to get(False)."""
        return self.get(False)

Usage:

>>> from RedisQueue import RedisQueue
>>> q = RedisQueue('test')
>>> q.put('hello world')

Now if we have a look at the redis database with the redis-cli client it shows the expected results:

redis 127.0.0.1:6379> keys *
1) "queue:test"
redis 127.0.0.1:6379> type queue:test
list
redis 127.0.0.1:6379> llen queue:test
(integer) 1
redis 127.0.0.1:6379> lrange queue:test 0 1
1) "hello world"

We can get the item from a different script with:

>>> from RedisQueue import RedisQueue
>>> q = RedisQueue('test')
>>> q.get()
'hello world'

A subsequent call of q.get() will block until another one puts a new item into the queue.
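
If you do not want to block forever you can pass a timeout, or use get_nowait(); a small sketch using the queue from above:

>>> q = RedisQueue('test')
>>> q.get(timeout=5)      # blocks for at most 5 seconds, returns None if nothing arrives
>>> q.get_nowait()        # returns immediately, None if the queue is empty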

The next step would be to add an encoder/decoder (e.g. python-json) to the queue so that you are not limited to sending strings.
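
A minimal sketch of such an extension, assuming the standard library json module and the RedisQueue class from above (the name JsonRedisQueue is just a placeholder):

import json

class JsonRedisQueue(RedisQueue):
    """RedisQueue that JSON-encodes items on put and decodes them on get."""

    def put(self, item):
        # serialize any JSON-compatible object to a string before storing it
        super(JsonRedisQueue, self).put(json.dumps(item))

    def get(self, block=True, timeout=None):
        item = super(JsonRedisQueue, self).get(block, timeout)
        if item is not None:
            # redis returns the raw string/bytes; turn it back into a Python object
            item = json.loads(item)
        return item

With this you can put dictionaries or lists on the queue, e.g. q.put({'task': 'resize', 'id': 42}).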

There already exists the nice and simple hotqueue library, which has the same interface as the above example and provides encoding/decoding.
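
For example (a sketch based on hotqueue's documented interface; items are serialized with pickle by default, so arbitrary Python objects can be queued):

from hotqueue import HotQueue

queue = HotQueue('test', host='localhost', port=6379, db=0)
queue.put({'greeting': 'hello world'})   # any picklable object
print(queue.get())                       # {'greeting': 'hello world'}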

Other notable queue implementations with a redis backend are:

  • flask-redis A basic message queue with Redis for Flask.
  • celery An asynchronous task queue/job queue based on distributed message passing. Much more advanced. Can be used with different storage backends.
  • rq A simple Python library for queueing jobs and processing them in the background with workers (see the sketch after this list).
  • resque A Redis-backed Ruby library for creating background jobs, placing them on multiple queues, and processing them later. Used at GitHub. Includes a nice monitoring web interface.
  • pyres A resque clone in Python.
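
As a taste of rq, a small sketch (the count_words function and the module name tasks.py are placeholders; rq workers import jobs by module path, so the function has to live in an importable module):

# tasks.py -- placeholder module holding the job function
def count_words(text):
    return len(text.split())

# enqueue.py -- put a job on the default queue
from redis import Redis
from rq import Queue
from tasks import count_words

q = Queue(connection=Redis())
job = q.enqueue(count_words, 'hello world from rq')
# a separate worker process started with `rq worker` executes the job;
# job.result stays None until the worker has finished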
