Python Route 45 - Multithreading


Threading Module

There are two ways to create and call a thread.

Calling directly

import threading
import time

def sayhi(num):  # define the function each thread will run
    print("running on number: %s" % num)
    time.sleep(2)

if __name__ == "__main__":
    t1 = threading.Thread(target=sayhi, args=(1,))  # create a thread instance
    t2 = threading.Thread(target=sayhi, args=(2,))  # create a thread instance
    t1.start()  # start the thread
    t2.start()  # start the thread
    print(t1.getName())  # get the thread name
    print(t2.getName())

Calling via inheritance

import threading
import time

class MyThread(threading.Thread):
    def __init__(self, num):
        super(MyThread, self).__init__()
        self.num = num

    def run(self):  # define the function each thread will run
        print("running on number: %s" % self.num)
        time.sleep(2)

if __name__ == "__main__":
    t1 = MyThread("1")
    t2 = MyThread("2")
    t1.start()
    t2.start()


A concurrent multithreading example

First look at a piece of code that starts 10 threads, each of which sleeps for 2 seconds. I want to measure the total execution time, expecting it to be about 2 seconds.

import threading
import time

def run(n):
    print("task", n, threading.current_thread())
    time.sleep(2)

start_time = time.time()  # start time
for i in range(10):
    t = threading.Thread(target=run, args=(i,))
    t.start()
use_time = time.time() - start_time
print("use time:", use_time)

But after running it I found that the program printed the execution time before the threads had finished. This is not the result I wanted: the main thread keeps executing without waiting for the child threads to complete, because nothing blocks it here. This is where join comes in: join waits for a child thread to finish executing.

import threading
import time

def run(n):
    print("task", n, threading.current_thread())
    time.sleep(2)

start_time = time.time()
t_list = []
for i in range(10):
    t = threading.Thread(target=run, args=(i,))
    t.start()
    t_list.append(t)
for t in t_list:
    t.join()  # wait for each child thread to finish
use_time = time.time() - start_time
print("use time:", use_time)

This way the main thread waits until all child threads have finished before printing the execution time, and the result is as expected: use time: 2.0050017833709717


Daemon Threads

A daemon thread is killed as soon as the main thread ends, no matter what it is still doing.

import threading
import time

def run(n):
    print("task", n, threading.current_thread())
    time.sleep(2)

start_time = time.time()
for i in range(10):
    t = threading.Thread(target=run, args=(i,))
    t.setDaemon(True)  # set as a daemon thread (must be done before start)
    t.start()
use_time = time.time() - start_time
print("use time:", use_time)

Each thread is set as a daemon thread via setDaemon(True). When this code runs, the main thread prints the usage time and the program exits immediately, without waiting for the sleeps to finish.
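As a side note, newer Python 3 versions also let you mark a thread as a daemon through the daemon attribute or the constructor's daemon keyword instead of setDaemon(True); a minimal sketch:

import threading
import time

def run(n):
    print("task", n)
    time.sleep(2)

t = threading.Thread(target=run, args=(0,), daemon=True)  # daemon flag passed to the constructor
# equivalently: set t.daemon = True before t.start()
t.start()
print("main thread exits without waiting for the daemon thread")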


Thread Lock (Mutex)

A process can start multiple threads, and those threads share the same memory space, which means each thread can access the same data. If two threads modify the same data at the same time, problems are bound to occur.

To avoid this problem we can "add a lock", that is, a thread lock (mutex), so that only one thread can modify the data at a time.

import threading

def run(n):
    lock.acquire()  # acquire the lock
    global num
    num += 1
    print(n)
    lock.release()  # release the lock

lock = threading.Lock()  # instantiate a mutex
num = 0
t_list = []
for i in range(50):
    t = threading.Thread(target=run, args=("t-%s" % i,))
    t.start()
    t_list.append(t)
for t in t_list:
    t.join()  # wait for all threads so num is final before printing
print("num:", num)
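As an aside, threading.Lock also works as a context manager, so the acquire/release pair above can be written as a with block, which releases the lock even if the body raises an exception; a minimal sketch of the same counter:

import threading

num = 0
lock = threading.Lock()

def run(n):
    global num
    with lock:  # acquires the lock on entry, releases it on exit
        num += 1

threads = [threading.Thread(target=run, args=(i,)) for i in range(50)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("num:", num)  # 50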


Recursive lock

A recursive lock is used when one lock is nested inside another, i.e. the same lock has to be acquired again before it has been released.

import threading

num1, num2 = 0, 0

def run1():
    print("grab the first part data")
    lock.acquire()
    global num1
    num1 += 1
    lock.release()
    return num1

def run2():
    print("grab the second part data")
    lock.acquire()
    global num2
    num2 += 1
    lock.release()
    return num2

def run3():
    lock.acquire()
    res1 = run1()
    print("-----between run1 and run2-----")
    res2 = run2()
    lock.release()
    print("res1:%s, res2:%s" % (res1, res2))

lock = threading.RLock()  # recursive lock: the same thread can acquire it more than once

for i in range(10):
    t = threading.Thread(target=run3)
    t.start()

while threading.active_count() != 1:
    pass
else:
    print("-----all threads done-----")
    print(num1, num2)


Semaphore

A mutex allows only one thread to modify data at a time, while a semaphore allows a fixed number of threads to do so at once. For example, a toilet with 3 stalls lets at most 3 people in at a time; everyone behind them has to wait for someone to come out.

import threading
import time

def run(n):
    semaphore.acquire()
    time.sleep(2)
    print("run the thread: %s" % n)
    semaphore.release()

if __name__ == "__main__":
    semaphore = threading.BoundedSemaphore(3)  # run up to 3 threads at a time
    for i in range(20):
        t = threading.Thread(target=run, args=(i,))
        t.start()

while threading.active_count() != 1:
    pass
else:
    print("-----all threads done-----")


Event

An event lets two or more threads interact with each other, as in the traffic light example below.

import threading
import time
import random

def light():
    if not event.isSet():
        event.set()  # set the flag: green light, cars may pass
    count = 0
    while True:
        if count < 10:
            print('\033[42;1m--green light on---\033[0m')
        elif count < 13:
            print('\033[43;1m--yellow light on---\033[0m')
        elif count < 20:
            if event.isSet():
                event.clear()  # clear the flag: red light, cars must wait
            print('\033[41;1m--red light on---\033[0m')
        else:
            count = 0
            event.set()  # back to green
        time.sleep(1)
        count += 1

def car(n):
    while True:
        time.sleep(random.randrange(10))
        if event.isSet():
            print("car [%s] is running ..." % n)
        else:
            print("car [%s] is waiting for the red light." % n)

if __name__ == "__main__":
    event = threading.Event()
    light_thread = threading.Thread(target=light)
    light_thread.start()
    for i in range(3):
        car_thread = threading.Thread(target=car, args=(i,))
        car_thread.start()


Queue

Program decoupling

Faster execution

A queue comes in several modes:

queue.Queue(maxsize=0)  # first in, first out (FIFO)

queue.LifoQueue(maxsize=0)  # last in, first out (LIFO)

queue.PriorityQueue(maxsize=0)  # items are stored with a priority and retrieved in priority order


First-in, first-out example; items come out as 1, 2, 3

import queue

q = queue.Queue()
q.put(1)
q.put(2)
q.put(3)
print(q.get())
print(q.get())
print(q.get())


Last-in, first-out example; items come out as 3, 2, 1

import queue

q = queue.LifoQueue()
q.put(1)
q.put(2)
q.put(3)
print(q.get())
print(q.get())
print(q.get())


Priority queue example; the lowest priority number comes out first: (1, "jiachen"), (2, "jack"), (3, "tom")

import queue

q = queue.PriorityQueue()
q.put((2, "jack"))
q.put((3, "tom"))
q.put((1, "jiachen"))
print(q.get())
print(q.get())
print(q.get())

q.qsize()  returns the number of items currently in the queue

q.empty()  returns True if the queue is empty, otherwise False

q.full()  returns True if the queue is full, otherwise False

q.put(item, block=True, timeout=None)  puts an item into the queue (blocking by default when the queue is full)

q.put_nowait(item)  puts an item into the queue; if the queue is full it raises an exception immediately instead of blocking

q.get(block=True, timeout=None)  takes an item out of the queue; with block=True it blocks until an item is available, with block=False it raises an exception immediately, and timeout is how long to wait before raising an exception

q.get_nowait()  takes an item out of the queue; if the queue is empty it raises an exception immediately instead of blocking

q.task_done()  tells the queue that an item previously fetched with get() has been fully processed; used together with q.join()

q.join()  blocks until every item put into the queue has been fetched and marked done with task_done(), as shown in the sketch below
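To make these methods concrete, here is a small sketch that wires put, get_nowait, task_done and join together; the worker function and the item values are made up for illustration:

import queue
import threading

q = queue.Queue(maxsize=5)

def worker():
    while True:
        try:
            item = q.get_nowait()  # raises queue.Empty instead of blocking
        except queue.Empty:
            break
        print("processing", item)
        q.task_done()  # mark this item as processed

for i in range(5):
    q.put(i)  # would block if the queue were already full

t = threading.Thread(target=worker)
t.start()
q.join()  # returns once task_done() has been called for every item
print("all items processed")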


Producer Consumer Model

A simple example

import queue
import threading
import time

def producer(name):
    count = 1
    while True:
        q.put(count)
        print("[%s] produced bone [%s]" % (name, count))
        count += 1
        time.sleep(0.5)

def consumer(name):
    while True:
        q.get()
        print("[%s] eats a bone" % name)
        time.sleep(1)

if __name__ == "__main__":
    q = queue.Queue(5)
    p = threading.Thread(target=producer, args=("xxx",))
    c1 = threading.Thread(target=consumer, args=("Tom",))
    c2 = threading.Thread(target=consumer, args=("Jack",))
    p.start()
    c1.start()
    c2.start()


This article is from the "Eight Miles" blog; please keep this source when reposting: http://5921271.blog.51cto.com/5911271/1907305
