Multithreading can be understood simply as performing multiple tasks at the same time. This article is a beginner tutorial on Python's threading module, with examples.
1.1 What is multithreading (threading)?
Multithreading can be understood simply as performing multiple tasks at the same time.
Both multiprocessing and multithreading can perform multiple tasks, and a thread is part of a process. Threads are characterized by the ability to share memory and variables with each other, and they consume fewer resources (although in a Unix environment the scheduling cost of processes and threads does not differ significantly, since Unix scheduling is fast). The disadvantage is that synchronization and locking between threads is more troublesome.
1.2 Adding a thread (Thread)
Import the module:

import threading

Get the number of currently active threads:

threading.active_count()

List all currently alive threads:

threading.enumerate()

Get the thread that is currently running:

threading.current_thread()
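A quick sketch of these three inspection functions in action (the thread name 'demo-thread' is just illustrative):

```python
import threading
import time

def idle():
    time.sleep(0.5)  # keep the worker alive long enough to observe it

t = threading.Thread(target=idle, name='demo-thread')
t.start()

print(threading.active_count())    # number of alive threads (at least 2 here)
print(threading.enumerate())       # list of all alive Thread objects
print(threading.current_thread())  # the thread executing this line: MainThread
t.join()
```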
Add a thread with threading.Thread(). Its target parameter is the function the thread will run; you define this function yourself:
```python
import threading

def thread_job():
    print('This is a thread of %s' % threading.current_thread())

def main():
    thread = threading.Thread(target=thread_job)  # define the thread
    thread.start()  # let the thread start working

if __name__ == '__main__':
    main()
```
1.3 Join function
Because threads run concurrently with the main thread, the join function is used to make the calling thread wait for a thread to complete before proceeding to the next step: it blocks the caller until the joined thread finishes.
```python
import threading
import time

def thread_job():
    print('T1 start\n')
    for i in range(10):
        time.sleep(0.1)
    print('T1 finish\n')

def t2_job():
    print('T2 start\n')
    print('T2 finish\n')

def main():
    added_thread = threading.Thread(target=thread_job, name='T1')
    thread2 = threading.Thread(target=t2_job, name='T2')
    added_thread.start()
    # added_thread.join()
    thread2.start()
    # thread2.join()
    print('all done\n')

if __name__ == '__main__':
    main()
```
As written above, with the join calls commented out, 'all done' can be printed before T1 finishes, because the main thread does not wait for the workers. After uncommenting the join calls, T1 runs to completion, then T2, and only then does print('all done') execute.
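A minimal, deterministic illustration of the same point (this small example is mine, not from the video):

```python
import threading
import time

results = []

def worker():
    time.sleep(0.2)  # simulate some work
    results.append('worker done')

t = threading.Thread(target=worker)
t.start()
t.join()  # block the main thread until the worker finishes
results.append('all done')
print(results)  # join() guarantees 'worker done' comes first
```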
1.4 Storing results with Queue
Queue is a thread-safe FIFO (first-in, first-out) queue implementation in the Python standard library. It provides a first-in, first-out data structure for multithreaded programming and is typically used to pass data between producer and consumer threads.
(1) Basic FIFO queue
class queue.Queue(maxsize=0)
maxsize is an integer giving the upper limit on the number of items the queue can hold; once the limit is reached, put() blocks until items in the queue are consumed. If maxsize is less than or equal to 0, the queue size is unlimited.
(2) LIFO queue (last in, first out)
class queue.LifoQueue(maxsize=0)
(3) Priority queue
class queue.PriorityQueue(maxsize=0)
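To make the three queue types concrete, here is a small sketch (mine, not from the video) showing their retrieval orders, plus what happens when a bounded queue fills up:

```python
from queue import Queue, LifoQueue, PriorityQueue, Full

# FIFO: items come out in the order they were put in
fifo = Queue()
for n in (1, 2, 3):
    fifo.put(n)
fifo_out = [fifo.get() for _ in range(3)]
print(fifo_out)  # [1, 2, 3]

# LIFO: the most recently added item comes out first
lifo = LifoQueue()
for n in (1, 2, 3):
    lifo.put(n)
lifo_out = [lifo.get() for _ in range(3)]
print(lifo_out)  # [3, 2, 1]

# Priority: the lowest-valued entry comes out first
pq = PriorityQueue()
for item in ((2, 'code'), (1, 'eat'), (3, 'sleep')):
    pq.put(item)
pq_out = [pq.get() for _ in range(3)]
print(pq_out)  # [(1, 'eat'), (2, 'code'), (3, 'sleep')]

# A bounded queue blocks on put() when full; with a timeout it raises Full
bounded = Queue(maxsize=2)
bounded.put('a')
bounded.put('b')
try:
    bounded.put('c', timeout=0.1)
except Full:
    print('queue is full')
```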
The code in the video is not especially clear, so a cleaned-up version follows:
```python
import threading
import time
from queue import Queue

def job(l, q):
    for i in range(len(l)):
        l[i] = l[i] ** 2
    q.put(l)  # threads cannot return values, so pass the result back via the queue

def multithreading():
    q = Queue()
    threads = []
    data = [[1, 2, 3], [3, 4, 5], [4, 5, 6], [5, 6, 7]]
    for i in range(4):
        t = threading.Thread(target=job, args=(data[i], q))
        t.start()
        threads.append(t)
    for thread in threads:
        thread.join()
    results = []
    for _ in range(4):
        results.append(q.get())
    print(results)

if __name__ == '__main__':
    multithreading()
```
Each thread squares its sublist and puts the result on the queue; the output is [[1, 4, 9], [9, 16, 25], [16, 25, 36], [25, 36, 49]] (the order of the results depends on which threads finish first).
1.5 GIL: multithreading is not necessarily efficient
GIL stands for Global Interpreter Lock. The execution of Python code is controlled by the Python virtual machine (the interpreter main loop), and the GIL controls access to the virtual machine, ensuring that only one thread runs in the interpreter at any moment. In a multithreaded environment, the Python virtual machine executes as follows:
1. Set the GIL.
2. Switch to a thread to run.
3. Run until either:
   a. a specified number of bytecode instructions have executed, or
   b. the thread voluntarily yields control (e.g., by calling time.sleep(0)).
4. Put the thread back to sleep.
5. Release the GIL.
6. Repeat steps 1-5.
When calling external code, such as C/C++ extension functions, the GIL remains locked until the function ends (since no Python bytecode is run during this time, no thread switch is performed).
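The flip side is that the GIL is released during blocking calls such as time.sleep() or file and network I/O, which is why threads still help for I/O-bound work. A minimal sketch of this (mine, not from the video; timings are approximate and machine-dependent):

```python
import threading
import time

def io_bound():
    time.sleep(0.2)  # sleeping releases the GIL, so the threads overlap

start = time.time()
threads = [threading.Thread(target=io_bound) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.time() - start
print('4 sleeps of 0.2s took about %.2fs' % elapsed)  # roughly 0.2s, not 0.8s
```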
Below is the code from the video: it sums a list four times over, first the normal single-threaded way and then split across 4 threads, and finds that the running times are not very different in magnitude.
```python
import threading
from queue import Queue
import copy
import time

def job(l, q):
    res = sum(l)
    q.put(res)

def multithreading(l):
    q = Queue()
    threads = []
    for i in range(4):
        t = threading.Thread(target=job, args=(copy.copy(l), q), name='T%i' % i)
        t.start()
        threads.append(t)
    [t.join() for t in threads]
    total = 0
    for _ in range(4):
        total += q.get()
    print(total)

def normal(l):
    total = sum(l)
    print(total)

if __name__ == '__main__':
    l = list(range(1000000))
    s_t = time.time()
    normal(l * 4)
    print('normal:', time.time() - s_t)
    s_t = time.time()
    multithreading(l)
    print('multithreading:', time.time() - s_t)
```
Both versions print the same total, and the timings are of the same order of magnitude: because of the GIL, the four threads cannot execute Python bytecode in parallel, so the multithreaded version gains little for this CPU-bound task (thread overhead can even make it slower).
1.6 Thread lock (Lock)
If thread 1 produces a result and you want thread 2 to continue processing with that result, you can take a lock in thread 1, wait until thread 1 finishes executing, and then let thread 2 start. In general, a lock is used when working with shared memory: it ensures only one thread touches the shared data at a time.
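As an aside (not from the video), the idiomatic way to use a Lock is as a context manager, which guarantees the lock is released even if an exception is raised inside the critical section:

```python
import threading

lock = threading.Lock()
counter = 0

def add(n):
    global counter
    for _ in range(100000):
        with lock:  # acquire() on entry, release() on exit, even on exceptions
            counter += n

t1 = threading.Thread(target=add, args=(1,))
t2 = threading.Thread(target=add, args=(1,))
t1.start(); t2.start()
t1.join(); t2.join()
print(counter)  # always 200000 with the lock; without it the result can be lower
```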
```python
import threading

def job1():
    global A, lock  # global variables
    lock.acquire()  # take the lock
    for i in range(10):
        A += 1
        print('job1', A)
    lock.release()  # release the lock

def job2():
    global A, lock
    lock.acquire()
    for i in range(10):
        A += 10
        print('job2', A)
    lock.release()

if __name__ == '__main__':
    lock = threading.Lock()
    A = 0
    t1 = threading.Thread(target=job1)
    t2 = threading.Thread(target=job2)
    t1.start()
    t2.start()
    t1.join()
    t2.join()
```
With the lock, job1's ten prints appear consecutively before any of job2's, because whichever thread acquires the lock first runs its whole loop to completion; without the lock, the two jobs' prints interleave.
Summary