Python Multi-threaded Concurrency: threading & Task Queues (queue)

Source: Internet
Author: User
Tags: message queue, thread, class

https://docs.python.org/3.7/library/concurrency.html
A Python program is single-threaded by default, which means that a statement cannot start executing until the previous statement has finished.
Let's get a feel for threads first. Normally:

from time import sleep

def testa():
    sleep(1)
    print("a")

def testb():
    sleep(1)
    print("b")

testa()
testb()
# prints "a" after one second, then "b" after another second

But if you use threading:

import threading

ta = threading.Thread(target=testa)
tb = threading.Thread(target=testb)
for t in [ta, tb]:
    t.start()
for t in [ta, tb]:
    t.join()
print("done")
# The output is a and b printed close together (in either order), followed by "done".

What happens is this: after start(), ta begins running, but the main thread (the script itself) continues to the next loop iteration without waiting for it to finish, so tb starts as well; for the following period ta and tb (the two threads running testa and testb respectively) execute together. Compared with running them one after another, this is a clear gain in speed.

The Thread class is an abstraction of a thread. Its constructor's target parameter points to a function object, i.e. the specific operation the thread will perform. You can also pass args=<tuple> to hand arguments to the target function. Note that when a sequence is passed in, Thread automatically unpacks it into individual elements and passes them to the target function; presumably the parameter is declared as *args internally.
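As a quick illustration of target and args, here is a minimal sketch; the greet() function and its arguments are invented for the example:

import threading

def greet(name, times):   # hypothetical worker function, just for the illustration
    for _ in range(times):
        print("hello,", name)

# args is a tuple; Thread unpacks it and effectively calls greet("world", 2)
t = threading.Thread(target=greet, args=("world", 2))
t.start()
t.join()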

The join() method is a tricky one. join([timeout]) blocks the calling (main) thread until the child thread on which it was called has finished, after which the main thread continues running. (I was confused at first and wrote join() right after each start(); that way multiple threads have no speed advantage at all, exactly like running single-threaded.) As in the example above, the common practice is one loop to start all the threads and a second loop to join them all.
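To make that pitfall concrete, here is a minimal sketch reusing the testa/testb functions from earlier; the timings are only approximate:

# Wrong: joining right after each start() serializes the threads,
# so this takes about 2 seconds, just like calling testa(); testb().
for f in [testa, testb]:
    t = threading.Thread(target=f)
    t.start()
    t.join()

# Right: start them all first, then join them all;
# the two sleeps overlap and this takes about 1 second.
threads = [threading.Thread(target=f) for f in [testa, testb]]
for t in threads:
    t.start()
for t in threads:
    t.join()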

About the thread lock

Multithreaded programs run into a classic problem: when different threads modify or use the same shared resource, a thread lock becomes necessary.

You can create a simple thread lock with the threading.Lock class: lock = threading.Lock(). Before a thread touches the shared resource, call lock.acquire(); once the lock has been acquired it cannot be acquired again until it is released, and a second acquire() will simply block. When the thread is done with the resource, call lock.release() to release the lock. In general, locked multithreading still improves efficiency somewhat, but threads will block and wait, for example when writing to a file. Lock-free multithreading can improve efficiency further, but may cause problems such as read-write conflicts, so use it with caution: only go lock-free after verifying that the threads share no common resources.
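A minimal sketch of the lock pattern described above; the shared counter is invented for the example, and the with statement shown in the comment is equivalent to the acquire()/release() pair:

import threading

lock = threading.Lock()
counter = 0                      # hypothetical shared resource

def worker():
    global counter
    for _ in range(100000):
        lock.acquire()           # only one thread may modify counter at a time
        counter += 1
        lock.release()
        # equivalently: with lock: counter += 1

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)                   # 400000: no updates are lost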

The approach above wraps threads in a procedural style; the following describes how to abstract threads in an object-oriented way.

To abstract a thread in an object-oriented way, define a class that inherits from Thread, for example class MyThread(Thread). An instance of this class represents one thread, and you override the run() method of the class (run, not start!! although what start() actually does is call run()) to perform the specific operation. At this point a lock can be taken as a constructor parameter, so passing the same lock into different instances implements lock control across threads. For example:

# Method two: inherit from Thread and override run()
import threading
import time

class MyThread(threading.Thread):
    def __init__(self, arg):
        super(MyThread, self).__init__()   # Note: be sure to call the parent class's initializer explicitly
        self.arg = arg

    def run(self):
        # define the work each thread performs
        time.sleep(1)
        print('The arg is: %s\r' % self.arg)

for i in range(4):
    t = MyThread(i)
    t.start()

print('Main thread end!')

The Thread class also has the following methods, which a custom subclass can invoke as well (a short usage sketch follows the list):

getName()

setName(...) In fact, the Thread constructor also has a name parameter, which gives the corresponding thread a name. Both of these methods are tied to the name attribute.

isAlive() A thread is considered alive from the time start() is called until shortly after its run() method terminates.

setDaemon(True/False) sets whether a thread is a daemon thread. When a thread is set as a daemon thread, the program does not wait for that thread to finish before exiting; see http://blog.csdn.net/u012063703/article/details/51601579
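A small sketch of these methods, written against the Python 3.7 API that the article links to (in newer versions the name and daemon attributes, is_alive(), and the daemon= constructor argument are preferred over these camelCase methods):

import threading
import time

def work():                       # trivial worker, invented for the example
    time.sleep(1)

t = threading.Thread(target=work, name="worker-1")   # name can be set in the constructor...
print(t.getName())                # ...or read/changed via getName()/setName()
t.setDaemon(True)                 # daemon thread: the program will not wait for it on exit
t.start()
print(t.isAlive())                # True while run() has not finished
t.join()
print(t.isAlive())                # False once run() has completed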

In addition to the Thread class, the threading module also has the following members, briefly (a small sketch follows the list):

The Timer class: Timer(interval, func) is like Thread except that it only starts running the given function in a new thread after interval seconds have passed.

currentThread() gets the current thread object

activeCount() gets the total number of currently active threads

enumerate() gets a list of all currently active threads

settrace(func) sets a trace function that is passed to sys.settrace() for each thread before its run() is called

setprofile(func) sets a profile function that is passed to sys.setprofile() for each thread before its run() is called
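A short sketch of the Timer class and the module-level helpers above, again using the Python 3.7 names; settrace()/setprofile() are left out because they only make sense with a real trace or profile function:

import threading

def hello():                      # invented callback for the Timer example
    print("fired from:", threading.currentThread().getName())

timer = threading.Timer(2.0, hello)   # runs hello() in a new thread after 2 seconds
timer.start()

print(threading.activeCount())    # total number of live threads (main thread + timer)
print(threading.enumerate())      # list of all live Thread objects
timer.join()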

The queue module is used to create and manipulate queues, and is often combined with threading to build a simple task queue.

First, there are several kinds of queues. Classified by the order in which items enter and leave, they can be divided into the following (a short sketch follows the list):

queue.Queue(maxsize) FIFO (first-in, first-out queue)

queue.LifoQueue(maxsize) LIFO (last-in, first-out queue)

queue.PriorityQueue(maxsize) priority queue: the lower the priority value, the earlier the item comes out

If maxsize is set to less than 1, the queue length is unbounded.
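A minimal sketch of how the three queue classes differ in the order items come out; the numbers are arbitrary:

import queue

fifo = queue.Queue()              # first in, first out
lifo = queue.LifoQueue()          # last in, first out
prio = queue.PriorityQueue()      # smallest value comes out first

for q in (fifo, lifo, prio):
    for n in (3, 1, 2):
        q.put(n)

print([fifo.get() for _ in range(3)])   # [3, 1, 2]
print([lifo.get() for _ in range(3)])   # [2, 1, 3]
print([prio.get() for _ in range(3)])   # [1, 2, 3]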

FIFO is the most commonly used queue; some of its common methods are listed below, with a small usage sketch after the list:

Queue.qsize() returns the queue size

Queue.empty() determines whether the queue is empty

Queue.full() determines whether the queue is full

Queue.get([block[, timeout]]) removes an item from the head of the queue and returns it. block defaults to True, meaning that if the queue is empty when you call get(), the thread blocks and waits until an item becomes available. If block is False, the Empty exception is raised immediately when you call get() on an empty queue. The timeout parameter can additionally be set when block is True: if the queue is still empty after get() has blocked for timeout seconds, the Empty exception is raised.

Queue.put(item[, block[, timeout]]) inserts an item at the tail of the queue. If block=True and the queue is full, it blocks and waits for free space before putting; with block=False the Full exception is raised immediately. As with get(), timeout is a parameter that only takes effect when block is True.

Queue.task_done() After processing an item obtained with get(), calling task_done() sends the queue a signal that the task has been completed.

Queue.join() keeps track of all items and blocks the main thread until task_done() has been called for every one of them, after which the main thread continues executing. The benefit is this: when a worker thread takes the last task off the queue, the queue becomes empty even though that task has not yet finished being processed. With join(), the main thread does not exit merely because the queue is empty; it waits until the last task has actually been completed.
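A minimal sketch of the methods above, showing the non-blocking exceptions and the task_done()/join() handshake; the queue size and items are made up for the example:

import queue

q = queue.Queue(maxsize=2)
q.put("a")
q.put("b")
print(q.qsize(), q.full(), q.empty())   # 2 True False

try:
    q.put("c", block=False)             # queue is full and we are not blocking
except queue.Full:
    print("queue is full")

q.get()                                 # take "a" off the head
q.task_done()                           # tell the queue this item has been processed
q.get()
q.task_done()
q.join()                                # returns at once: every item got a task_done()

try:
    q.get(block=True, timeout=0.1)      # empty queue: blocks 0.1 s, then raises
except queue.Empty:
    print("queue is empty")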

Combining threading and queue, you can build a simple producer-consumer model, for example:

import threading
import queue
import time

class Worker(threading.Thread):
    def __init__(self, q):
        threading.Thread.__init__(self)
        self.queue = q
        self.thread_stop = False

    def run(self):
        while not self.thread_stop:
            print("thread%d %s: waiting for task" % (self.ident, self.name))
            try:
                task = self.queue.get(block=True, timeout=20)   # receive a message
            except queue.Empty:
                print("Nothing to do! I'll go home!")
                self.thread_stop = True
                break
            print("task recv: %s, task no: %d" % (task[0], task[1]))
            print("I am working")
            time.sleep(3)
            print("Work finished!")
            self.queue.task_done()       # mark one task as complete
            res = self.queue.qsize()     # check the message queue size
            if res > 0:
                print("fuck! There are still %d tasks to do" % res)

    def stop(self):
        self.thread_stop = True

if __name__ == "__main__":
    q = queue.Queue(3)
    worker = Worker(q)
    worker.start()
    q.put(["Produce one cup!", 1], block=True, timeout=None)   # produce task messages
    q.put(["Produce one desk!", 2], block=True, timeout=None)
    q.put(["Produce one apple!", 3], block=True, timeout=None)
    q.put(["Produce one banana!", 4], block=True, timeout=None)
    q.put(["Produce one bag!", 5], block=True, timeout=None)
    print("***************leader: wait for finish!")
    q.join()   # wait for all tasks to complete
    print("***************leader: all tasks finished!")

The output looks like this:

thread139958685849344 Thread-1: waiting for task
task recv: Produce one cup!, task no: 1
I am working
Work finished!
fuck! There are still 3 tasks to do
thread139958685849344 Thread-1: waiting for task
task recv: Produce one desk!, task no: 2
I am working
***************leader: wait for finish!
Work finished!
fuck! There are still 3 tasks to do
thread139958685849344 Thread-1: waiting for task
task recv: Produce one apple!, task no: 3
I am working
Work finished!
fuck! There are still 2 tasks to do
thread139958685849344 Thread-1: waiting for task
task recv: Produce one banana!, task no: 4
I am working
Work finished!
fuck! There are still 1 tasks to do
thread139958685849344 Thread-1: waiting for task
task recv: Produce one bag!, task no: 5
I am working
Work finished!
thread139958685849344 Thread-1: waiting for task
***************leader: all tasks finished!
Nothing to do! I'll go home!

In the example above there is no performance boost (after all, only one worker thread is running). The point of a task queue is not to further improve execution speed, but to make thread concurrency more organized. As you can see, once a task queue is added, the program gains control over the amount of concurrency. A new thread that wants to start work must join the queue and wait until one of the existing tasks is finished. For example:

for i in range(x):
    t = MyThread(queue)
    t.start()

Here x is a variable; we do not know in advance how many threads the loop will spawn, and left unchecked that would be risky. But with a queue, every thread is built with the same queue object as a constructor parameter, so the threads have to execute within the size limit the queue specifies, and we no longer need to worry about the danger of too many threads.
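A hedged sketch of that idea: a fixed pool of workers all pulling from one bounded queue; the worker count, queue size, and task payloads are invented for the illustration:

import threading
import queue
import time

task_q = queue.Queue(maxsize=3)       # the queue bounds how much work can pile up
NUM_WORKERS = 2                       # a fixed number of threads instead of an unknown x

def worker():
    while True:
        task = task_q.get()
        if task is None:              # sentinel value: time to go home
            task_q.task_done()
            break
        print(threading.current_thread().name, "doing", task)
        time.sleep(0.5)
        task_q.task_done()

workers = [threading.Thread(target=worker) for _ in range(NUM_WORKERS)]
for w in workers:
    w.start()

for i in range(10):
    task_q.put("task %d" % i)         # blocks whenever the queue is full

for _ in workers:
    task_q.put(None)                  # one sentinel per worker

task_q.join()                         # wait until every task got a task_done()
for w in workers:
    w.join()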


