Using Redis queues for asynchronous tasks with Python's Flask framework

Tags: message queue

Transferred from: http://www.jb51.net/article/86021.htm

Asynchronous tasks
Open a browser, type in an address, press Enter, and a page appears. Behind the scenes, the client sends an HTTP request to the server, the server processes the request, and the response content is returned.

We browse the web every day, sending requests large and small to servers. Sometimes a server receives a request and finds that it needs to call another server, or that it has other work to do first, so the original request is blocked while the server finishes that other work.

More often, the extra work the server does doesn't require the client to wait, so it can be done asynchronously. There are many tools for handling asynchronous tasks, but the core principle is the same: notification messages are processed, usually through a queue structure, and producing and consuming those messages drives both the communication and the business logic.

Producers, consumers, and queues
The asynchronous tasks described above can be abstracted into the producer-consumer model. Think of a restaurant: cooks keep cooking and diners keep eating. If the cooks make more food than is being eaten, they take a break; if there are many diners and the food runs out, the diners have to wait. There are many ways to implement producers and consumers; below is a small example using the standard library's Queue module:

import random
import time
from Queue import Queue
from threading import Thread

queue = Queue(10)

class Producer(Thread):
  def run(self):
    while True:
      elem = random.randrange(9)
      queue.put(elem)
      print "Cook {} made dish {} --- {} dishes left unsold".format(self.name, elem, queue.qsize())
      time.sleep(random.random())

class Consumer(Thread):
  def run(self):
    while True:
      elem = queue.get()
      print "Diner {} ate dish {} --- {} dishes left to eat".format(self.name, elem, queue.qsize())
      time.sleep(random.random())

def main():
  for i in range(3):
    p = Producer()
    p.start()
  for i in range(2):
    c = Consumer()
    c.start()

if __name__ == '__main__':
  main()

The approximate output is as follows:

Cook Thread-1 made dish 1 --- 1 dishes left unsold
Cook Thread-2 made dish 8 --- 2 dishes left unsold
Cook Thread-3 made dish 3 --- 3 dishes left unsold
Diner Thread-4 ate dish 1 --- 2 dishes left to eat
Diner Thread-5 ate dish 8 --- 1 dishes left to eat
Diner Thread-4 ate dish 3 --- 0 dishes left to eat
Cook Thread-1 made dish 0 --- 1 dishes left unsold
Cook Thread-2 made dish 0 --- 2 dishes left unsold
Cook Thread-1 made dish 1 --- 3 dishes left unsold
Cook Thread-1 made dish 1 --- 4 dishes left unsold
Diner Thread-4 ate dish 0 --- 3 dishes left to eat
Cook Thread-3 made dish 3 --- 4 dishes left unsold
Diner Thread-5 ate dish 0 --- 3 dishes left to eat
Diner Thread-5 ate dish 1 --- 2 dishes left to eat
Cook Thread-2 made dish 8 --- 3 dishes left unsold
Cook Thread-2 made dish 8 --- 4 dishes left unsold

Redis queue
Python has a handy queue structure built in, but we can implement similar operations with Redis and use them to run a simple asynchronous task.

Redis provides two ways to build a message queue. One is the producer-consumer pattern, the other is the publish-subscribe pattern. In the former, one or more clients listen on a message queue; as soon as a message arrives, a consumer takes it (whoever grabs it first gets it), and if the queue is empty the consumers simply keep listening. In the latter, one or more clients subscribe to a message channel; as soon as the publisher publishes a message, every subscriber receives it.

Producer-consumer pattern
This mainly uses Redis's blpop command to fetch data from the queue. If the queue has no data, the call blocks and waits, which effectively means listening.

import redis

class Task(object):

  def __init__(self):
    self.rcon = redis.StrictRedis(host='localhost', db=5)
    self.queue = 'task:prodcons:queue'

  def listen_task(self):
    while True:
      task = self.rcon.blpop(self.queue, 0)[1]
      print "Task get", task

if __name__ == '__main__':
  print 'listen task queue'
  Task().listen_task()

Publish-subscribe pattern
This uses Redis's pubsub feature: subscribers subscribe to a channel, publishers post messages to the channel, and the channel acts as the message queue.

import redis

class Task(object):

  def __init__(self):
    self.rcon = redis.StrictRedis(host='localhost', db=5)
    self.ps = self.rcon.pubsub()
    self.ps.subscribe('task:pubsub:channel')

  def listen_task(self):
    for i in self.ps.listen():
      if i['type'] == 'message':
        print "Task get", i['data']

if __name__ == '__main__':
  print 'listen task channel'
  Task().listen_task()

The Flask entry point
The two scripts above implement the back-end services for the asynchronous tasks; start them directly and they will listen for messages on the Redis queue or channel. A simple Flask front end for testing them looks like this:

import redis
import random
import logging
from flask import Flask, redirect

app = Flask(__name__)

rcon = redis.StrictRedis(host='localhost', db=5)
prodcons_queue = 'task:prodcons:queue'
pubsub_channel = 'task:pubsub:channel'

@app.route('/')
def index():
  html = """<br><center><br><a href="/prodcons">Producer-consumer pattern</a><br><br><a href="/pubsub">Publish-subscribe pattern</a></center>"""
  return html

@app.route('/prodcons')
def prodcons():
  elem = random.randrange(10)
  rcon.lpush(prodcons_queue, elem)
  logging.info("lpush {} -- {}".format(prodcons_queue, elem))
  return redirect('/')

@app.route('/pubsub')
def pubsub():
  ps = rcon.pubsub()
  ps.subscribe(pubsub_channel)
  elem = random.randrange(10)
  rcon.publish(pubsub_channel, elem)
  return redirect('/')

if __name__ == '__main__':
  app.run(debug=True)

Start the listener scripts and the Flask application, then generate requests with:

siege -c10 -r 5 http://127.0.0.1:5000/prodcons
siege -c10 -r 5 http://127.0.0.1:5000/pubsub

You can then see the asynchronous messages printed separately in the output of the two listener scripts. Time-consuming operations can be performed inside these asynchronous tasks. Of course, with this approach the caller does not know the result of the asynchronous execution; if you need the result, consider designing the task flow around that, or use tools such as RQ or Celery.
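If you do want the result of an asynchronous task but prefer to stay with plain Redis rather than pulling in RQ or Celery, one possible approach is to have the worker write the result back under a per-task key that the caller can poll. The following is only a minimal sketch of that idea; the queue name task:square:queue, the result key prefix, and the squaring "work" are all made up for illustration:

import uuid
import redis

# decode_responses makes blpop/get return strings rather than bytes.
rcon = redis.StrictRedis(host='localhost', db=5, decode_responses=True)

def submit(n):
  # Enqueue the number together with a generated task id so the result can be looked up later.
  task_id = uuid.uuid4().hex
  rcon.lpush('task:square:queue', '{}:{}'.format(task_id, n))
  return task_id

def work_once():
  # Worker side: pop one task, do the "work" (squaring the number),
  # and store the result under a per-task key with a one-hour expiry.
  task_id, n = rcon.blpop('task:square:queue', 0)[1].split(':')
  rcon.setex('task:square:result:' + task_id, 3600, int(n) ** 2)

def get_result(task_id):
  # Caller side: returns None until the worker has finished the task.
  return rcon.get('task:square:result:' + task_id)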
