Queue, producer-consumer model

I. Python queue:
    • Overview:
      • import queue
      • The queue lives in the Python process's memory; when the process exits, the queue's contents are gone.
      • You can browse these queue classes by jumping into the queue module's source in PyCharm.
    • FIFO queue (see the sketch after this list):
      • q = queue.Queue(10) # create a FIFO queue holding at most 10 items; putting an 11th blocks; with no argument (or 0) the size is unlimited
      • q.qsize() # number of elements currently in the queue
      • q.maxsize # maximum size of the queue
      • q.put(item, timeout=2) # wait up to 2 seconds; if there is still no free slot, raise queue.Full
      • q.put(item, block=False) # by default put() blocks until a slot is free; with block=False it raises queue.Full immediately when the queue is full
      • q.get() # remove and return an item; blocks if the queue is empty
      • q.get(block=False) # do not block; raise queue.Empty if the queue is empty
      • q.get(timeout=2) # wait up to 2 seconds, then raise queue.Empty
      • q.join() # block until every item that was put has been marked done (the unfinished-task counter reaches zero); the main thread waits here
      • q.task_done() # call after processing each item fetched with get(); it decrements the unfinished-task counter
    • Last-in, first-out queue (a stack); it inherits from Queue, so the parent-class methods are all available:
      • q = queue.LifoQueue() # last in, first out
    • Priority queue (also inherits from Queue):
      • q = queue.PriorityQueue()
      • q.put((1, 'alex')) # each item carries a priority; items come back out in priority order (lowest value first)
    • Bidirectional queue:
      • q = collections.deque() # double-ended queue from the collections module
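A minimal sketch of the FIFO methods listed above (the item strings are arbitrary placeholders):

import queue

q = queue.Queue(10)           # FIFO queue, at most 10 items
q.put('task-1')               # blocks if the queue is full
q.put('task-2', timeout=2)    # raises queue.Full after 2 seconds if still full
print(q.qsize(), q.maxsize)   # 2 10

print(q.get())                # 'task-1'; would block if the queue were empty
q.task_done()                 # mark the fetched item as processed
print(q.get(block=False))     # 'task-2'; raises queue.Empty instead of blocking
q.task_done()

q.join()                      # returns at once here: the unfinished-task counter is back to zero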
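The other three queue types from the list above, in the same style (the items are arbitrary examples):

import queue
import collections

lifo = queue.LifoQueue()      # stack: last in, first out
lifo.put(1); lifo.put(2)
print(lifo.get())             # 2

pq = queue.PriorityQueue()    # lowest priority value comes out first
pq.put((2, 'beta'))
pq.put((1, 'alex'))
print(pq.get())               # (1, 'alex')

dq = collections.deque()      # double-ended queue
dq.append('right'); dq.appendleft('left')
print(dq.popleft(), dq.pop()) # left right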
II. Producer-consumer model (queue)
import queue
import threading
import time

q = queue.Queue(20)  # create a FIFO queue holding at most 20 buns

# "producer": each cook keeps making buns until the queue is full, then blocks
def productor(arg):
    while True:
        q.put(str(arg) + ': bun')  # put a bun into the queue, labelled "cook number: bun"

# "consumer": each customer keeps eating buns until the queue is empty, then blocks
def consumer(arg):
    while True:
        print(arg, q.get())  # take a bun out of the queue: customer number, then "cook number: bun"
        time.sleep(2)

for i in range(3):  # create 3 cooks
    t = threading.Thread(target=productor, args=(i,))
    t.start()

for j in range(20):  # create 20 customers
    t = threading.Thread(target=consumer, args=(j,))
    t.start()
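The example above runs forever and never calls q.task_done() or q.join(). A possible refinement (a sketch, not part of the original code) produces a fixed batch of buns and uses a None sentinel per consumer, so q.join() can wait for everything to be eaten and the threads exit cleanly:

import queue
import threading

q = queue.Queue()                 # unbounded here, so the whole batch fits before consumers start

def consumer(arg):
    while True:
        item = q.get()
        if item is None:          # sentinel: no more buns, stop this consumer
            q.task_done()
            break
        print(arg, item)
        q.task_done()             # tell the queue this bun has been handled

for i in range(3):                # 3 cooks each produce a batch of 10 buns
    for _ in range(10):
        q.put(str(i) + ': bun')

threads = [threading.Thread(target=consumer, args=(j,)) for j in range(5)]
for t in threads:
    t.start()

for _ in threads:                 # one sentinel per consumer
    q.put(None)

q.join()                          # block until every bun and sentinel has been marked done
for t in threads:
    t.join()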

    • Benefits of using a queue:
      • Conclusion:
        • A queue lets the server accept far more concurrent client requests, so requests are not rejected (no denial of service).
        • To actually raise processing throughput, you still have to add back-end worker threads.
      • Explanation (see the sketch after this list):
        • Scenario: the server has 4 worker threads, clients send dynamic requests (e.g. INSERT INTO ...), and each INSERT takes 20 s to execute.
        • Without a queue:
        • Process:
          • When the server receives a request, it immediately assigns a worker thread to it and keeps the client connection open for the whole 20 s of processing.
        • Concurrency: at most 4 concurrent requests per 20 s.
        • Request processing: in 20 s, the 4 worker threads handle 4 requests, one each; any further incoming requests simply sit there unprocessed.
        • With a queue:
        • Process:
          1. When the server receives a request, it drops the request straight into the queue and disconnects from the client.
          2. Back-end worker threads (the consumers of the queue) keep fetching requests from the queue and processing them.
          3. When the client later asks for the result of its request:
            • If the request is still in the queue, return its position: "your request is being processed, there are N requests ahead of you".
            • If the request is no longer in the queue, the server only needs a SELECT against the DB to return the result (much faster than the INSERT).
        • Concurrency: effectively unlimited, because the connection is released as soon as the request is queued, so the 4 connections are always free to accept new clients.
        • Request processing: unchanged; in 20 s the 4 worker threads still handle only 4 requests, one each, which is why more worker threads are needed to improve throughput.
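A minimal sketch of this pattern, assuming made-up helpers accept() and poll() and a dict standing in for the database; the 20 s INSERT is shortened to 1 s so the demo finishes quickly:

import queue
import threading
import time

requests = queue.Queue()           # pending client requests
results = {}                       # request id -> result; stands in for the DB

def worker():
    while True:
        req_id, sql = requests.get()
        time.sleep(1)              # pretend the INSERT takes a while (20 s in the scenario above)
        results[req_id] = 'done'
        requests.task_done()

for _ in range(4):                 # 4 back-end worker threads, matching the scenario
    threading.Thread(target=worker, daemon=True).start()

def accept(req_id, sql):
    # Enqueue the request and return immediately, so the client connection is freed.
    requests.put((req_id, sql))
    return 'accepted'

def poll(req_id):
    # Client asks later: report the queue position, or look up the stored result.
    # Peeking at the internal deque (requests.queue) is fine for a sketch, but it is not a public API.
    pending = [r for r, _ in list(requests.queue)]
    if req_id in pending:
        return 'processing, %d request(s) ahead of you' % pending.index(req_id)
    return results.get(req_id, 'unknown request')

for i in range(10):
    accept(i, 'INSERT INTO t VALUES (%d)' % i)
print(poll(9))                     # most likely still queued
time.sleep(4)
print(poll(0))                     # most likely 'done' by now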

