Python Multithreaded Programming (1)

Source: Internet
Author: User
Tags: lock, queue
Multithreaded programming requires understanding some basic concepts that apply to all programming languages. This article covers:

Concurrent programming

Multi-tasking operating system

Multithreading vs Multi-process

Thread Safety

The life cycle of a thread

Types of threads

Concurrent programming

Different programming paradigms view software from different perspectives. Concurrent programming sees software as a combination of tasks and resources: tasks compete for and share resources, execute when the resources they need are available, and otherwise wait for those resources.

Concurrent programming makes software easy to understand and reuse, and can greatly improve performance in some scenarios.

Multi-tasking operating system

To implement concurrency, you first need the support of the operating system. Most of today's operating systems are multi-tasking, and can perform multiple tasks "at the same time."

Multitasking can be performed at the level of a process or thread.

A process is an application that runs in memory, and each process has its own separate piece of memory space. Multitasking operating systems can execute these processes "concurrently".

A thread is a block of code executing within a process, and multiple threads can run "simultaneously", which is why threads are considered "concurrent". The purpose of multithreading is to maximize the utilization of CPU resources. For example, in a JVM process, all program code runs as threads.

This "simultaneous", "concurrency" is just a macro feeling, actually from the micro-level view is only the process/thread rotation execution, but the switch time is very short, so produced a "parallel" feeling.

Multithreading vs Multi-process

The operating system allocates a separate block of memory to each process, while multiple threads share their process's memory. The most straightforward consequence is that the overhead of creating a thread is much smaller than the cost of creating a process.

Also, because processes have separate memory, communication between processes is relatively difficult and requires mechanisms such as pipes/named pipes, signals, message queues, shared memory, and sockets. Communication between threads, by contrast, is simple and fast: they just share the process's global variables.

However, while process scheduling is the operating system's responsibility, coordinating threads is up to us: we must avoid deadlock, starvation, livelock, resource exhaustion, and similar situations, which adds a certain amount of complexity. And because threads share memory, we also have to consider thread safety.
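A minimal sketch of the memory difference, using only the standard library (the counter variable and bump function are purely illustrative): a thread mutates the process's global variable in place, while a child process works on its own copy of it.

    import threading
    import multiprocessing

    counter = 0

    def bump():
        global counter
        counter += 1

    if __name__ == "__main__":
        # A thread shares the process's memory, so the change to `counter` is visible here.
        t = threading.Thread(target=bump)
        t.start()
        t.join()
        print(counter)   # 1

        # A child process gets its own copy of memory, so the parent's `counter` is unchanged.
        p = multiprocessing.Process(target=bump)
        p.start()
        p.join()
        print(counter)   # still 1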

Thread Safety

A process's global variables are shared between threads, so when another thread changes a shared variable, the current thread is affected. The so-called thread-safety constraint requires that a function, when called repeatedly by multiple concurrent threads, always produces correct results. The key to ensuring thread safety is to guarantee correct access to shared variables, usually by locking.
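A minimal sketch of protecting a shared variable with threading.Lock (the balance variable and deposit function are only illustrative): the with lock: block makes the read-modify-write on the shared variable atomic with respect to the other threads.

    import threading

    balance = 0
    lock = threading.Lock()

    def deposit(amount, times):
        global balance
        for _ in range(times):
            # The read-modify-write on `balance` is not atomic; the lock keeps
            # concurrent deposits from losing updates.
            with lock:
                balance += amount

    threads = [threading.Thread(target=deposit, args=(1, 100000)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    print(balance)  # 400000 with the lock; without it, updates may be lost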

A more restrictive constraint than thread safety is "reentrancy": a function may be paused mid-execution in one thread, called in another thread, and then resumed in the original thread after the second call returns, and it is still guaranteed to execute correctly throughout. Reentrancy is usually achieved by making local copies of global variables.
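A hypothetical illustration of the "local copies" idea (both function names are made up for this sketch): the first function keeps its state in a module-level list and is therefore not reentrant, while the second keeps all state in local variables.

    # Not reentrant: state lives in a shared global, so an interleaved call from
    # another thread can clobber it mid-execution.
    _parts = []

    def join_items_unsafe(items):
        del _parts[:]
        for item in items:
            _parts.append(str(item))
        return ",".join(_parts)

    # Reentrant: all state is local ("a local copy"), so overlapping calls from
    # different threads cannot interfere with one another.
    def join_items_reentrant(items):
        parts = []
        for item in items:
            parts.append(str(item))
        return ",".join(parts)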

The life cycle of a thread

A "life cycle" is, in essence, a state diagram of an object from creation to destruction. A thread moves through the states New, Runnable, Running, Blocked, and Dead.

Each state is described below:

New: a newly created thread; after initialization it enters the Runnable state.

Runnable (ready): waiting to be scheduled; once scheduled, it enters the Running state.

Running: currently executing.

Blocked: execution is paused; once unblocked, the thread returns to the Runnable state and waits to be scheduled again.

Dead: the thread's method has finished executing or has terminated abnormally.

A running thread can enter the Blocked state in three ways:

Synchronization: the thread tries to acquire a synchronization lock, but the resource is already locked by another thread, so it blocks until the resource becomes available (the order of acquisition is controlled by the lock's queue).

Sleeping: after the thread calls sleep() or join(), it enters the sleeping state. The difference is that sleep() waits for a fixed amount of time, while join() waits for a child thread to finish executing (join() can also be given a timeout). Semantically, if thread A calls B.join(), the two threads effectively merge (join) into one. The most common usage is to join all child threads from the main thread; see the sketch after this list.

Waiting: after the thread calls wait(), it enters the waiting state and stays there until another thread notifies it with notify().
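A minimal sketch of the last two cases, using the standard threading module (the worker, ready, and cond names are only illustrative): the main thread blocks in wait() until the child thread calls notify(), then blocks in join() until the child thread finishes.

    import threading
    import time

    cond = threading.Condition()
    ready = False

    def worker():
        global ready
        time.sleep(0.1)        # Sleeping: blocked for a fixed amount of time
        with cond:
            ready = True
            cond.notify()      # wake up the thread blocked in wait()

    t = threading.Thread(target=worker)
    t.start()

    with cond:
        while not ready:
            cond.wait()        # Waiting: blocked until another thread calls notify()
    print("worker signalled the main thread")

    t.join()                   # Sleeping (join): blocked until the child thread finishes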

Types of threads

Main thread: when a program starts, the operating system (OS) creates a process, and one thread starts running immediately; this is usually called the program's main thread. Every process has at least one main thread, and the main thread is usually the last one to finish.

Child threads: the other threads created in the program; relative to the main thread, they are its child threads.

Daemon thread: a flag on a thread. A daemon thread provides services to other threads, such as the JVM's garbage-collection thread. When only daemon threads remain, the process exits; see the sketch after this list.

Foreground threads: relative to daemon threads, all other threads are called foreground threads.
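A minimal sketch of the daemon flag (the background_service function is only illustrative): because the child thread is marked as a daemon, the process exits as soon as the main thread finishes, even though the daemon's loop never ends.

    import threading
    import time

    def background_service():
        while True:            # an endless loop is acceptable only in a daemon thread
            time.sleep(1)

    t = threading.Thread(target=background_service)
    t.daemon = True            # same effect as the setDaemon(True) call listed later; must be set before start()
    t.start()

    print("main thread exiting")
    # Once only daemon threads remain, the process exits without waiting for them.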


Python's Support for Multithreading

Virtual machine level

The Python virtual machine uses the GIL (Global Interpreter Lock) to serialize threads' access to shared resources, so for now it cannot take advantage of multiple processors.

Language level

At the language level, Python provides good support for multithreading through the thread, threading, and Queue modules, which make it easy to create threads and to use mutexes, semaphores, synchronization, and other features.

thread: the low-level module supporting multithreading; direct use is generally not recommended.

threading: wraps thread, turning thread operations into objects, and provides the following classes:

Thread: the thread class.

Timer: similar to Thread, but waits for a given interval before starting to run.

Lock: lock primitive.

RLock: reentrant lock; allows a single thread to acquire the same lock more than once.

Condition: condition variable; lets a thread pause until another thread satisfies a "condition".

Event: a general-purpose condition variable; multiple threads can wait for an event, and all of them are woken up when the event occurs (see the sketch after this list).

Semaphore: provides a "waiting room"-like structure for threads waiting on a lock.

BoundedSemaphore: like Semaphore, but the counter may not exceed its initial value.
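A minimal sketch of Event (the start_signal and runner names are only illustrative): every runner blocks in wait(), and a single set() call wakes all of them at once.

    import threading
    import time

    start_signal = threading.Event()

    def runner(name):
        start_signal.wait()    # every runner blocks here until the event is set
        print(name, "is off")

    for i in range(3):
        threading.Thread(target=runner, args=("runner-%d" % i,)).start()

    time.sleep(0.1)            # let all runners reach wait()
    start_signal.set()         # fires the event: all waiting threads wake up at once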

Queue: implements multi-producer, multi-consumer queues. It supports lock primitives and provides good synchronization support across multiple threads. The classes it provides:

Queue: FIFO (first in, first out) queue.

LifoQueue: LIFO (last in, first out) queue.

PriorityQueue: priority queue.
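A minimal producer/consumer sketch using Queue (the sentinel-based shutdown is just one possible convention; note the module is spelled queue in Python 3 and Queue in Python 2):

    import threading
    import queue               # named Queue in Python 2

    q = queue.Queue()

    def producer():
        for i in range(5):
            q.put(i)           # put() does its own locking
        q.put(None)            # sentinel telling the consumer to stop

    def consumer():
        while True:
            item = q.get()     # blocks until an item is available
            if item is None:
                break
            print("consumed", item)

    threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()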

Among these, Thread is the primary thread class; you use it to create thread instances. The methods it provides include:

getName(self): returns the thread's name.

isAlive(self): a boolean flag indicating whether the thread is still running.

isDaemon(self): returns the thread's daemon flag.

join(self, timeout=None): the caller blocks until the thread ends; if a timeout is given, it blocks for at most timeout seconds.

run(self): defines the thread's function body.

setDaemon(self, daemonic): sets the thread's daemon flag to daemonic.

setName(self, name): sets the thread's name.

start(self): starts the thread's execution.
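A minimal sketch of the typical usage: subclass Thread, override run(), and drive the thread with start() and join(). The CountdownThread class is only illustrative, and the snippet uses the modern name/is_alive() spellings of the getName()/isAlive() methods listed above.

    import threading

    class CountdownThread(threading.Thread):
        def __init__(self, n):
            threading.Thread.__init__(self)
            self.n = n

        def run(self):                          # run() holds the thread's work; do not call it directly
            for i in range(self.n, 0, -1):
                print("%s: %d" % (self.name, i))   # self.name is what getName()/setName() expose

    t = CountdownThread(3)
    t.name = "countdown"                        # equivalent to setName("countdown")
    t.start()                                   # start() spawns the thread and invokes run()
    t.join()                                    # block until run() returns
    print(t.is_alive())                         # modern spelling of isAlive(); False here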

Third-party support

If you are particularly concerned about performance, you can also consider some of the "micro-threading" implementations:

Stackless Python: an enhanced version of Python that provides support for micro-threads. A micro-thread is a lightweight thread: compared with ordinary threads, switching between micro-threads takes less time and consumes fewer resources.

Greenlet: a spin-off of Stackless. Stackless calls its micro-threads "tasklets"; tasklets run in pseudo-concurrency and use channels for synchronous data exchange. A "greenlet" is a more primitive notion of a micro-thread with no scheduling of its own: you can build a micro-thread scheduler on top of greenlets yourself, or use greenlets to implement advanced control flow, as in the sketch below.
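A minimal sketch of explicit switching with the third-party greenlet package (task_a and task_b are only illustrative): there is no scheduler, so control moves only where switch() is called.

    # Requires the third-party greenlet package (pip install greenlet).
    from greenlet import greenlet

    def task_a():
        print("A: start")
        gr_b.switch()          # hand control to task_b; no scheduler is involved
        print("A: resumed")

    def task_b():
        print("B: start")
        gr_a.switch()          # hand control back to task_a

    gr_a = greenlet(task_a)
    gr_b = greenlet(task_b)
    gr_a.switch()              # output: A: start / B: start / A: resumed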

In the next section, we will create and start threads in Python.
