Python Queue Module Usage in Detail

Source: Internet
Author: User
Tags: sleep


1. A first look at the Queue module

The Queue module implements multi-producer, multi-consumer queues. It is especially useful for multithreaded programs where information must be exchanged safely between threads. The Queue class in this module implements all the required locking semantics. It depends on the availability of Python thread support; see the threading module.

The module implements three types of queue: FIFO (first in, first out; the default), LIFO (last in, first out), and a priority queue. The common constructors and methods are:

FIFO (first in, first out): q = Queue.Queue(maxsize)
LIFO (last in, first out): q = Queue.LifoQueue(maxsize)
Priority queue: q = Queue.PriorityQueue(maxsize)
Queue.qsize() returns the approximate size of the queue
Queue.empty() returns True if the queue is empty, False otherwise
Queue.full() returns True if the queue is full, False otherwise
Queue.full corresponds to the maxsize argument
Queue.put(item[, block[, timeout]]) writes item to the queue; timeout is how long a blocking put waits for a free slot
Queue.get([block[, timeout]]) gets an item from the queue; timeout is the wait time
Queue.get_nowait() is equivalent to Queue.get(False)
Queue.put_nowait(item) is equivalent to Queue.put(item, False)
Queue.task_done() signals the queue, after a piece of work is finished, that the task is complete
Queue.join() blocks until all items in the queue have been retrieved and processed
A more detailed overview is available in the Python standard library documentation for the Queue module.
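As an aside, in Python 3 this module is renamed to lowercase queue, and the non-blocking variants raise queue.Full and queue.Empty. A small sketch (Python 3 syntax) of those methods:

```python
import queue  # named "Queue" in Python 2

q = queue.Queue(maxsize=2)  # a bounded FIFO queue
q.put_nowait('a')
q.put_nowait('b')
print(q.full())        # True: qsize() has reached maxsize

try:
    q.put_nowait('c')  # queue is full, so this raises immediately
except queue.Full:
    print('queue is full')

print(q.get_nowait())  # 'a' comes out first (FIFO)
print(q.get_nowait())  # 'b'

try:
    q.get_nowait()     # queue is now empty
except queue.Empty:
    print('queue is empty')
```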

2. Queue examples

1. FIFO (first in, first out)

import Queue

q = Queue.Queue()
for i in range(5):
    q.put(i)
while not q.empty():
    print q.get()

The output is as follows:

[root@361way queue]# python fifo.py

0
1
2
3
4

The output order is the same as the entry order.

2. LIFO (last in, first out)

import Queue

q = Queue.LifoQueue()
for i in range(5):
    q.put(i)
while not q.empty():
    print q.get()
The output is as follows:

4
3
2
1
0

3. A queue with priority

import Queue

class Job(object):
    def __init__(self, priority, description):
        self.priority = priority
        self.description = description
        print 'New job:', description

    def __cmp__(self, other):
        return cmp(self.priority, other.priority)

q = Queue.PriorityQueue()
q.put(Job(3, 'mid-level job'))
q.put(Job(10, 'low-level job'))
q.put(Job(1, 'important job'))
while not q.empty():
    next_job = q.get()
    print 'Processing job:', next_job.description

The output is as follows:

[root@361way queue]# python queue_priority.py
New job: mid-level job
New job: low-level job
New job: important job
Processing job: important job
Processing job: mid-level job
Processing job: low-level job
As the results above show, the smaller the priority value, the earlier the item is executed. Note also that this example is single-threaded; in a multithreaded example, multiple threads would call get() concurrently, and timing would decide which task is executed first.
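A side note: __cmp__ and cmp() exist only in Python 2. In Python 3 (where the module is renamed to lowercase queue), a common alternative is to put (priority, item) tuples on the queue and let tuple comparison supply the ordering, as in this sketch:

```python
import queue

q = queue.PriorityQueue()
q.put((3, 'mid-level job'))    # tuples compare on their first element
q.put((10, 'low-level job'))
q.put((1, 'important job'))

while not q.empty():
    priority, description = q.get()  # smallest priority value comes out first
    print('Processing job:', description)
# Processing job: important job
# Processing job: mid-level job
# Processing job: low-level job
```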

3. Queues and threads

In practice, queues are used in combination with threads. Here are a few code examples of queues with threads:

from Queue import *
from threading import Thread
import sys

# This function will process the items in the queue, in serial
def processor():
    while True:
        if queue.empty() == True:
            print "the queue is empty!"
            sys.exit(1)
        try:
            job = queue.get()
            print "I'm operating on job item: %s" % (job)
            queue.task_done()
        except:
            print "Failed to operate on job"

# Set variables
queue = Queue()
threads = 4

# A list of job items. You would want this to be more advanced,
# like reading from a file or database
jobs = ["job1", "job2", "job3"]

# Iterate over the jobs and put them into the queue in sequence
#for job in jobs:
for job in range(100):
    print "Inserting job into the queue: %s" % (job)
    queue.put(job)

# Start some threads; each one will process jobs from the queue
#for i in range(100):
for i in range(threads):
    th = Thread(target=processor)
    th.setDaemon(True)
    th.start()

# Wait until all jobs are processed before quitting
queue.join()

Note the "while True:" line in the processor function. If that line were not there, each thread would process only a single item; when the number of threads is smaller than the number of queue items, the remaining items would never be marked done and queue.join() would block forever. Adding that line effectively starts an endless loop in each thread that keeps consuming until the queue is empty, at which point the loop, and the program, ends.
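One caveat with the example above: checking queue.empty() and then calling get() is racy once several threads are involved. A sketch of a more robust variant (Python 3 syntax, where the module is lowercase queue) blocks in get(), calls task_done() once per item, and relies on daemon threads plus join():

```python
import queue
import threading

q = queue.Queue()
processed = []                     # collect results so we can see that all jobs ran

def processor():
    while True:
        job = q.get()              # blocks until an item is available
        processed.append(job)      # "operate" on the job item
        q.task_done()              # one task_done() per completed get()

for job in range(100):
    q.put(job)

for _ in range(4):
    threading.Thread(target=processor, daemon=True).start()

q.join()                           # returns once every item is marked done
print('processed', len(processed), 'jobs')
```

The daemon flag means the still-blocked worker threads are discarded when the main thread exits, so no sentinel values are needed here.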

Example 2:

[root@361way tmp]# python queue-example-1.py
task 0 finished
task 1 finished
task 3 finished
task 2 finished
task 5 finished
task 4 finished
task 6 finished
task 7 finished
task 9 finished
task 8 finished
[root@361way tmp]# more queue-example-1.py
# File: queue-example-1.py
import threading
import Queue
import time, random

WORKERS = 2

class Worker(threading.Thread):
    def __init__(self, queue):
        self.__queue = queue
        threading.Thread.__init__(self)

    def run(self):
        while 1:
            item = self.__queue.get()
            if item is None:
                break  # reached end of queue
            # pretend we're doing something that takes 10-100 ms
            time.sleep(random.randint(10, 100) / 1000.0)
            print "task", item, "finished"

#
# try it

queue = Queue.Queue(0)

for i in range(WORKERS):
    Worker(queue).start()  # start a worker

for i in range(10):
    queue.put(i)

for i in range(WORKERS):
    queue.put(None)  # add end-of-queue markers

The Python Queue module provides three types of queue:

1. FIFO queue: first in, first out.
2. LIFO queue: similar to a stack, i.e. last in, first out.
3. Priority queue: the lower the priority value, the earlier the item comes out.
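The three orderings can be compared side by side with a short sketch (Python 3 syntax, where the module is lowercase queue):

```python
import queue

# feed the same items to each queue type and drain them to see the ordering
for cls in (queue.Queue, queue.LifoQueue, queue.PriorityQueue):
    q = cls()
    for i in (3, 1, 2):
        q.put(i)
    order = [q.get() for _ in range(3)]
    print(cls.__name__, order)
# Queue [3, 1, 2]
# LifoQueue [2, 1, 3]
# PriorityQueue [1, 2, 3]
```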

Each of these three queues has a constructor:

1. class Queue.Queue(maxsize): FIFO
2. class Queue.LifoQueue(maxsize): LIFO
3. class Queue.PriorityQueue(maxsize): priority queue

The common methods in this module are:

Queue.qsize() returns the approximate size of the queue
Queue.empty() returns True if the queue is empty, False otherwise
Queue.full() returns True if the queue is full, False otherwise
Queue.full corresponds to the maxsize argument
Queue.get([block[, timeout]]) gets an item from the queue; timeout is the wait time
Queue.get_nowait() is equivalent to Queue.get(False)
Queue.put(item[, block[, timeout]]) writes item to the queue; timeout is how long a blocking put waits for a free slot
Queue.put_nowait(item) is equivalent to Queue.put(item, False)
Queue.task_done() signals the queue, after a piece of work is finished, that the task is complete
Queue.join() blocks until all items in the queue have been retrieved and processed


Here is an example:

#coding: utf-8

import Queue
import threading
import time
import random

q = Queue.Queue(0)  # a queue is useful when several threads share one object
NUM_WORKERS = 3

class MyThread(threading.Thread):

    def __init__(self, input, worktype):
        self._jobq = input
        self._work_type = worktype
        threading.Thread.__init__(self)

    def run(self):
        while True:
            if self._jobq.qsize() > 0:
                self._process_job(self._jobq.get(), self._work_type)
            else:
                break

    def _process_job(self, job, worktype):
        dojob(job, worktype)

def dojob(job, worktype):
    time.sleep(random.random() * 3)
    print "doing", job, "worktype", worktype

if __name__ == '__main__':
    print "begin..."
    for i in range(NUM_WORKERS * 2):
        q.put(i)  # put the jobs into the task queue
    print "job qsize:", q.qsize()

    for x in range(NUM_WORKERS):
        MyThread(q, x).start()

Thoughts on multithreading in Python

Let's start with an example:


import Queue, threading, time, random

class Consumer(threading.Thread):
    def __init__(self, que):
        threading.Thread.__init__(self)
        self.daemon = False
        self.queue = que

    def run(self):
        while True:
            if self.queue.empty():
                break
            item = self.queue.get()
            # processing the item
            time.sleep(item)
            print self.name, item
            self.queue.task_done()
        return

que = Queue.Queue()
for x in range(10):
    que.put(random.random() * 10, True, None)
consumers = [Consumer(que) for x in range(3)]

for c in consumers:
    c.start()
que.join()

The function of the code is to produce 10 random numbers (in the 0~10 range); each consumer takes a number, sleeps for that many seconds, and then prints its thread name and the number.

This code is the case of a fast producer (the 10 random numbers are enqueued at once) and 3 slow consumers.

Here, we start the three consumers and then block the main thread with que.join().

When the three threads find that the queue is empty, their run functions return and the three threads end. At the same time the main thread's join() unblocks, and the whole program ends.


For a slow producer and fast consumers, the code is as follows:


import Queue, threading, time, random

class Consumer(threading.Thread):
    def __init__(self, que):
        threading.Thread.__init__(self)
        self.daemon = False
        self.queue = que

    def run(self):
        while True:
            item = self.queue.get()
            if item is None:
                break
            # processing the item
            print self.name, item
            self.queue.task_done()
        self.queue.task_done()  # account for the None marker taken out above
        return

que = Queue.Queue()

consumers = [Consumer(que) for x in range(3)]
for c in consumers:
    c.start()
for x in range(10):
    item = random.random() * 10
    time.sleep(item)
    que.put(item, True, None)

que.put(None)
que.put(None)
que.put(None)
que.join()

In this case, the fast consumers must block on get() (otherwise the threads would end immediately). For the whole program to stop, a None marker is used, so that a child thread breaks out and returns when it encounters None.

Because the consumption speed is greater than the production speed, the child threads are mostly blocked in get(), waiting for new elements to be added to the queue, and tasks trickle in slowly.

Note the three put(None) calls at the end: each thread takes one None out of the queue before returning, so three are needed for all three threads to stop. Of course, there is a simpler, cruder method: set the child threads as daemon threads. Once production is complete, que.join() blocks until the queue is drained, then the main thread ends, and the child threads still blocked waiting on the queue are forcibly terminated because of the daemon attribute.
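The one-sentinel-per-consumer idea can also be sketched in Python 3 (where the module is renamed to lowercase queue): each consumer exits when it pulls a None marker, so the producer enqueues exactly as many markers as there are consumers. This is only an illustrative sketch, not the article's original code:

```python
import queue
import threading

NUM_CONSUMERS = 3
q = queue.Queue()
processed = []

def consumer():
    while True:
        item = q.get()
        if item is None:            # sentinel: this consumer is done
            q.task_done()
            break
        processed.append(item)      # "process" the item
        q.task_done()

threads = [threading.Thread(target=consumer) for _ in range(NUM_CONSUMERS)]
for t in threads:
    t.start()

for x in range(10):
    q.put(x)

for _ in range(NUM_CONSUMERS):
    q.put(None)                     # one sentinel per consumer

q.join()                            # all items and sentinels accounted for
for t in threads:
    t.join()                        # every consumer has exited cleanly
print(sorted(processed))
```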


This article gives two examples of multiple consumers with a single producer. The reference is the Python reference manual, third edition, which has single-consumer and single-producer code.

As for multithreading functions such as put and join, it is best to read up on the concepts first.
