One. Using threads in Python
There are two ways to use threads in Python: with a function, or with a class that wraps a thread object.
1. Function: call the start_new_thread() function in the thread module to create a new thread, as in the following example:
The code is as follows:

import thread
import time

def timer(no, interval):
    cnt = 0
    while cnt < 10:
        print 'Thread:(%d) Time:%s\n' % (no, time.ctime())
        time.sleep(interval)
        cnt += 1
    thread.exit_thread()

def test():  # use thread.start_new_thread() to create 2 new threads
    thread.start_new_thread(timer, (1, 1))
    thread.start_new_thread(timer, (2, 2))

if __name__ == '__main__':
    test()
    time.sleep(25)  # keep the main thread alive so both timers can finish
The example above defines a thread function, timer, which prints ten time records (one every interval seconds) and then exits. The first argument of thread.start_new_thread(function, args[, kwargs]) is the thread function (here, timer); the second is the argument tuple passed to that function, and it must be a tuple; kwargs is an optional dictionary of keyword arguments.
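For instance, keyword arguments can be passed through the optional kwargs dictionary. A minimal sketch, assuming a made-up greet function:

import thread
import time

def greet(name, greeting='Hello'):
    print '%s, %s!' % (greeting, name)

# positional arguments as a tuple, keyword arguments as a dict
thread.start_new_thread(greet, ('World',), {'greeting': 'Hi'})
time.sleep(1)  # give the new thread time to run before the main thread exits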
A thread can end by letting the thread function return naturally, or by calling thread.exit() or thread.exit_thread() inside the thread function.
2. Class: derive a subclass of threading.Thread that wraps a thread object, as in the following example:
The code is as follows:

import threading
import time

class Timer(threading.Thread):  # the Timer class is derived from threading.Thread
    def __init__(self, num, interval):
        threading.Thread.__init__(self)
        self.thread_num = num
        self.interval = interval
        self.thread_stop = False

    def run(self):  # overwrite run(); put whatever the thread should do here
        while not self.thread_stop:
            print 'Thread Object(%d), Time:%s\n' % (self.thread_num, time.ctime())
            time.sleep(self.interval)

    def stop(self):
        self.thread_stop = True

def test():
    thread1 = Timer(1, 1)
    thread2 = Timer(2, 2)
    thread1.start()
    thread2.start()
    time.sleep(10)
    thread1.stop()
    thread2.stop()
    return

if __name__ == '__main__':
    test()
Personally, I prefer the second way: create my own subclass of threading.Thread and, where necessary, override its methods, so that thread control can be customized as needed.
Usage of the threading.Thread class (a short sketch using these methods follows below):
1. In your own thread class's __init__(), call threading.Thread.__init__(self, name=threadName), where threadName is the name of the thread.
2. run(): usually overridden; put the code that implements the required functionality here.
3. getName(): get the thread object's name.
4. setName(): set the thread object's name.
5. start(): start the thread.
6. join([timeout]): block the calling thread until this thread finishes (or until the optional timeout expires).
7. setDaemon(bool): set whether the child thread exits together with the main thread; must be called before start(). The default is False.
8. isDaemon(): check whether the thread exits together with the main thread.
9. isAlive(): check whether the thread is still running.
In addition, the threading module itself provides many other methods and classes that help you use and manage threads; see http://www.python.org/doc/2.5.2/lib/module-threading.html.
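As a quick illustration of the methods listed above, here is a minimal sketch (the worker function and the sleep interval are just placeholders) that names a thread, marks it non-daemon, starts it, and waits for it with join():

import threading
import time

def worker(interval):
    time.sleep(interval)
    print 'Thread %s finished its work' % threading.currentThread().getName()

t = threading.Thread(target=worker, args=(1,))
t.setName('WorkerThread')  # same effect as passing name='WorkerThread' to the constructor
t.setDaemon(False)         # non-daemon (the default): the program waits for this thread
t.start()
print 'isAlive() right after start():', t.isAlive()
t.join()                   # block until the thread terminates
print 'isAlive() after join():', t.isAlive()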
Two. Thread synchronization
Suppose two thread objects, t1 and t2, both add 1 to a shared variable num that starts at 0. If t1 and t2 each modify num ten times, the final value of num should be 20. With multi-threaded access, however, the following can happen: when num is 0, t1 reads num = 0; the system then switches t1 to the "sleeping" state and t2 to the "running" state, and t2 also reads num = 0. t2 adds 1 to the value it read and writes it back, so num = 1. The system then puts t2 to "sleeping" and resumes t1, which adds 1 to the 0 it read earlier and writes the result back. Both t1 and t2 have performed one increment, yet num is still 1.
The case above illustrates the most common problem in multithreaded programs: data sharing. When multiple threads modify a shared piece of data, access to that data must be synchronized.
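A minimal sketch of this race, using the low-level thread module (the add_num function and the counts below are only illustrative); the final value printed is usually smaller than the expected 20000 because increments get lost:

import thread
import time

num = 0  # shared resource

def add_num(times):
    global num
    for i in range(times):
        tmp = num      # read the shared value
        time.sleep(0)  # give the scheduler a chance to switch threads
        num = tmp + 1  # write back a possibly stale value

thread.start_new_thread(add_num, (10000,))
thread.start_new_thread(add_num, (10000,))
time.sleep(3)  # let both threads finish
print 'num = %d (expected 20000)' % num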
1. Simple Synchronization
The simplest synchronization mechanism is the "lock". A lock object is created by the threading.RLock class. A thread obtains the lock by calling its acquire() method, which puts the lock into the "locked" state. Only one thread can hold the lock at any moment. If another thread tries to acquire the lock, it enters the "blocked" state until the thread holding the lock calls release(), which returns the lock to the "unlocked" state. A blocked thread is then notified and is entitled to acquire the lock. If several threads are blocked, all of them are first unblocked, the system chooses one of them to take the lock, and the others go back to waiting ("blocked").
The thread module and its lock object are low-level programming tools provided by Python and are very simple to use, as the following example shows:
The code is as follows:

import thread
import time

mylock = thread.allocate_lock()  # allocate a lock
num = 0  # shared resource

def add_num(name):
    global num
    while True:
        mylock.acquire()  # get the lock
        # do something to the shared resource
        print 'Thread %s locked! num=%s' % (name, str(num))
        if num >= 5:
            print 'Thread %s released! num=%s' % (name, str(num))
            mylock.release()
            thread.exit_thread()
        num += 1
        print 'Thread %s released! num=%s' % (name, str(num))
        mylock.release()  # release the lock

def test():
    thread.start_new_thread(add_num, ('A',))
    thread.start_new_thread(add_num, ('B',))

if __name__ == '__main__':
    test()
    time.sleep(3)  # keep the main thread alive while the two threads run
Python also provides a higher-level thread-control library built on top of thread: the threading module mentioned earlier. The threading module is built on the thread module and exposes many of its attributes. The thread module provides a basic user-level synchronization tool, the lock object. The threading module provides a variant of it: the RLock (reentrant lock) object. An RLock maintains a lock object internally and can be acquired repeatedly by the same thread. With a plain lock object, if a thread calls acquire() twice in a row without a release() in between, the second acquire() suspends the thread; since the lock can then never be released, the thread deadlocks. An RLock allows the same thread to acquire it multiple times, because it keeps an internal counter of how many times it has been acquired. Every acquire() must be matched by a release(), and only after all the release() calls have been made can another thread acquire the RLock.
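A minimal sketch of that difference (the nested_update function is only illustrative): the same thread can acquire an RLock twice, whereas doing the same with a plain Lock would block forever:

import threading

rlock = threading.RLock()

def nested_update():
    rlock.acquire()          # first acquire by this thread
    try:
        rlock.acquire()      # reentrant: the same thread may acquire again
        try:
            print 'both acquires succeeded'
        finally:
            rlock.release()  # every acquire needs a matching release
    finally:
        rlock.release()
    # with lock = threading.Lock() instead, the second acquire()
    # would block this thread forever (deadlock)

nested_update()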
The following example shows how to use threading's RLock object for synchronization.
The code is as follows:

import threading

mylock = threading.RLock()
num = 0

class MyThread(threading.Thread):
    def __init__(self, name):
        threading.Thread.__init__(self)
        self.t_name = name

    def run(self):
        global num
        while True:
            mylock.acquire()
            print '\nThread(%s) locked, Number: %d' % (self.t_name, num)
            if num >= 4:
                mylock.release()
                print '\nThread(%s) released, Number: %d' % (self.t_name, num)
                break
            num += 1
            print '\nThread(%s) released, Number: %d' % (self.t_name, num)
            mylock.release()

def test():
    thread1 = MyThread('A')
    thread2 = MyThread('B')
    thread1.start()
    thread2.start()

if __name__ == '__main__':
    test()
We call the code that modifies shared data a "critical section". Every critical section must be enclosed between acquire() and release() calls on the same lock object.
2. Condition Synchronization
Locks provide only the most basic synchronization. If a "critical section" should only be entered after certain events have occurred, you need a condition variable (Condition).
A Condition object is a wrapper around a lock object. When a Condition is created, its constructor takes a lock object as an argument; if no lock is given, the Condition creates an RLock internally. You can also call acquire() and release() on the Condition object, since the internal lock supports those operations, but the real value of Condition lies in the wait() and notify() semantics it provides.
How does a condition variable work? After a thread successfully acquires a condition variable, calling its wait() method makes the thread release the lock and enter the "blocked" state until another thread calls notify() on the same condition variable to wake it up. If the condition variable's notifyAll() method is called, all waiting threads are woken.
A deadlock occurs when a program or thread stays in the "blocked" state forever. So when you use locks, condition variables, and other synchronization mechanisms, check carefully to prevent deadlocks. For critical sections that may raise exceptions, use a finally clause to guarantee that the lock is released. A thread waiting on a condition variable must be explicitly woken with notify(), or it will wait forever: make sure every wait() call has a corresponding notify() call, and call notifyAll() if you want to be safe. A sketch of this pattern follows.
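Here is a minimal sketch of the wait()/notify() pattern with try/finally (the ready flag and the waiter/notifier functions are only illustrative):

import threading

con = threading.Condition()
ready = False

def waiter():
    con.acquire()
    try:
        while not ready:  # re-check the condition after every wake-up
            con.wait()    # releases the lock while blocked, reacquires it on wake-up
        print 'waiter: condition satisfied'
    finally:
        con.release()     # always release, even if an exception occurs

def notifier():
    global ready
    con.acquire()
    try:
        ready = True
        con.notify()      # wake one waiting thread (notifyAll() wakes them all)
    finally:
        con.release()

t = threading.Thread(target=waiter)
t.start()
notifier()
t.join()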
The problem of producer and consumer is a typical synchronization problem. Here is a brief introduction to two different implementation methods.
1. Condition variable
The code is as follows:

import threading
import time

class Producer(threading.Thread):
    def __init__(self, t_name):
        threading.Thread.__init__(self, name=t_name)

    def run(self):
        global x
        con.acquire()
        if x > 0:
            con.wait()
        for i in range(5):
            x = x + 1
            print "producing..." + str(x)
        con.notify()  # wake the consumer if it is waiting
        print x
        con.release()

class Consumer(threading.Thread):
    def __init__(self, t_name):
        threading.Thread.__init__(self, name=t_name)

    def run(self):
        global x
        con.acquire()
        if x == 0:
            print 'consumer waits'
            con.wait()  # woken by the producer's notify()
        for i in range(5):
            x = x - 1
            print "consuming..." + str(x)
        con.notify()
        print x
        con.release()

con = threading.Condition()
x = 0
print 'start consumer'
c = Consumer('Consumer')
print 'start producer'
p = Producer('Producer')
p.start()
c.start()
p.join()
c.join()
print x
In the example above, when the consumer acquires the condition first, it finds x == 0 and waits; the producer then produces five times (adding 1 to x each time) and notifies the waiting consumer, which wakes up and starts consuming (subtracting 1 from x each time). If the producer happens to acquire the condition first, it simply produces five items and the consumer then consumes them without waiting.
2. Synchronization queue
Python's Queue object also provides support for thread synchronization. With a Queue object, any number of producers and consumers can share a FIFO queue.
Producers put data into the queue in order, and consumers take data out of the queue in turn.
The code is as follows:

# producer_consumer_queue
from Queue import Queue
import random
import threading
import time

# Producer thread
class Producer(threading.Thread):
    def __init__(self, t_name, queue):
        threading.Thread.__init__(self, name=t_name)
        self.data = queue

    def run(self):
        for i in range(5):
            print "%s: %s is producing %d to the queue!\n" % (time.ctime(), self.getName(), i)
            self.data.put(i)
            time.sleep(random.randrange(10) / 5)
        print "%s: %s finished!" % (time.ctime(), self.getName())

# Consumer thread
class Consumer(threading.Thread):
    def __init__(self, t_name, queue):
        threading.Thread.__init__(self, name=t_name)
        self.data = queue

    def run(self):
        for i in range(5):
            val = self.data.get()
            print "%s: %s is consuming. %d in the queue is consumed!\n" % (time.ctime(), self.getName(), val)
            time.sleep(random.randrange(10))
        print "%s: %s finished!" % (time.ctime(), self.getName())

# Main thread
def main():
    queue = Queue()
    producer = Producer('Pro.', queue)
    consumer = Consumer('Con.', queue)
    producer.start()
    consumer.start()
    producer.join()
    consumer.join()
    print 'All threads terminate!'

if __name__ == '__main__':
    main()
In the example above, the producer creates a "product" at random intervals and puts it into the queue, and the consumer takes a "product" out of the queue and consumes it. Because the producer here produces faster than the consumer consumes, the producer often puts several "products" into the queue before the consumer takes one out.
The Queue module implements a FIFO queue that supports multiple producers and multiple consumers. A Queue is useful whenever information must be exchanged safely between multiple threads. By default a Queue has no size limit, but you can bound it with the maxsize parameter of its constructor. The queue's put method inserts an item at the tail of the queue; its prototype is:
put(item[, block[, timeout]])
If the optional argument block is True and timeout is None (the defaults), the calling thread blocks until a free slot becomes available. If timeout is a positive number and no free slot becomes available within timeout seconds, the Full exception is raised. Conversely, if block is False (timeout is ignored), the item is put on the queue immediately if a free slot is available; otherwise the Full exception is raised.
The queue's get method removes and returns an item from the head of the queue, and it takes the same parameters as put. If block is True and timeout is None (the defaults), the calling thread blocks until an item is available. If timeout is a positive number and no item becomes available within timeout seconds, the Empty exception is raised. Conversely, if block is False (timeout is ignored), an item is returned immediately if one is available; otherwise the Empty exception is raised.
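A minimal sketch of these put/get variants (a hypothetical bounded queue of size 2; Full and Empty come from the Queue module):

from Queue import Queue, Full, Empty

q = Queue(maxsize=2)   # a bounded queue with room for two items

q.put('a')             # blocking put (block=True, timeout=None): succeeds immediately
q.put('b', True, 1)    # would block for at most 1 second if the queue were full
try:
    q.put('c', False)  # non-blocking put on a full queue
except Full:
    print 'queue is full, item not added'

print q.get()          # blocking get: returns 'a'
print q.get(True, 1)   # returns 'b', waiting at most 1 second
try:
    q.get(False)       # non-blocking get on an empty queue
except Empty:
    print 'queue is empty, nothing to get'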