Multithreaded programming requires careful attention. For most tasks, you can reduce complexity by queuing requests for execution on thread pool threads. This topic addresses more difficult situations, such as coordinating the work of multiple threads or handling threads that block.
Deadlocks and Race Conditions
Multithreading solves problems with throughput and responsiveness, but in doing so it introduces new problems: deadlocks and race conditions.
Deadlock
A deadlock occurs when each of two threads tries to lock a resource that the other has already locked. Neither thread can make any further progress.
Many methods of the managed threading classes provide time-outs to help you detect deadlocks. For example, the following code attempts to acquire a lock on the current instance. If the lock is not obtained within 300 milliseconds, Monitor.TryEnter returns false.
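The code example that this paragraph refers to was not reproduced in this text. A minimal C# sketch of the pattern it describes might look like the following, assuming it appears inside an instance method with System.Threading imported. It locks on the current instance, as the paragraph states, although a later guideline in this topic advises caution when locking on this.

```csharp
if (Monitor.TryEnter(this, 300))
{
    try
    {
        // Place code protected by the lock here.
    }
    finally
    {
        // Always release the lock, even if an exception is thrown.
        Monitor.Exit(this);
    }
}
else
{
    // The lock was not acquired within 300 milliseconds.
    // Handle the time-out here, for example by logging a possible deadlock.
}
```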
Race Conditions
A race condition is a bug that occurs when the outcome of a program depends on which of two or more threads reaches a particular block of code first. Running the program many times produces different results, and the result of any given run cannot be predicted.
A simple example of a race condition is incrementing a field. Suppose a class has a private static field (Shared in Visual Basic) that is incremented every time an instance of the class is created, using code such as objct++; (C#) or objct += 1 (Visual Basic). This operation requires loading the value from objct into a register, incrementing the value, and then storing it back in objct.
In a multithreaded application, a thread that has loaded and incremented the value might be preempted by another thread that performs all three steps; when the first thread resumes execution and stores its value, it overwrites objct without taking into account that the value changed while its execution was suspended.
This particular race condition is easily avoided by using methods of the Interlocked class, such as Interlocked.Increment. To learn about other techniques for synchronizing data among multiple threads, see Synchronizing Data for Multithreading.
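To make the lost update concrete, here is a small sketch (not part of the original text; the names are illustrative) in which several threads increment a plain field with ++ and also increment a second field with Interlocked.Increment. The plain total usually comes up short because increments are lost; the Interlocked total is always exact.

```csharp
using System;
using System.Threading;

class Counter
{
    static int unsafeCount;
    static int safeCount;

    static void Main()
    {
        Thread[] threads = new Thread[4];
        for (int i = 0; i < threads.Length; i++)
        {
            threads[i] = new Thread(() =>
            {
                for (int j = 0; j < 100_000; j++)
                {
                    unsafeCount++;                        // load, increment, store: updates can be lost
                    Interlocked.Increment(ref safeCount); // atomic read-modify-write
                }
            });
            threads[i].Start();
        }
        foreach (Thread t in threads) t.Join();

        // unsafeCount is usually less than 400000; safeCount is always 400000.
        Console.WriteLine($"unsafe: {unsafeCount}, safe: {safeCount}");
    }
}
```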
Race conditions can also occur when you synchronize the activities of multiple threads. Whenever you write a line of code, you must consider what could happen if a thread were preempted before executing that line (or before any of the machine instructions that make up the line) and another thread overtook it.
Number of processors
Multithreading solves different problems on single-processor computers and on computers with multiple processors. Most single-processor computers run end-user software, while multiprocessor computers are typically used as servers.
Single processor computer
Multithreading provides greater responsiveness to the computer user and uses idle time to perform background tasks. If you use multithreading on a single-processor computer:
There is only one thread running at any time.
A background thread executes only when the main user thread is idle. A foreground thread that runs constantly starves background threads of processor time.
When you call the Thread.Start method on a thread, that thread does not begin executing until the current thread yields or is preempted by the operating system.
Race conditions typically occur because the programmer did not anticipate that a thread can be preempted at an awkward moment, sometimes allowing another thread to reach a code block first.
Multi-processor computer
Multithreading provides greater throughput. Ten processors can do ten times the work of one, but only if the work is divided so that all ten can be working at once; threads provide an easy way to divide the work and exploit the extra processing power. If you use multithreading on a multiprocessor computer:
The number of threads that can execute concurrently is limited by the number of processors.
A background thread executes only when the number of foreground threads currently executing is smaller than the number of processors.
When you call the Thread.Start method on a thread, that thread might or might not start executing immediately, depending on the number of processors and the number of threads currently waiting to execute.
Race conditions can occur not only because threads are preempted unexpectedly, but also because two threads executing on different processors might race to reach the same code block.
Static members and constructors
A class is not initialized until its class constructor (static constructor in C#, Shared Sub New in Visual Basic) has finished running. To prevent the execution of code on a type that is not initialized, the common language runtime blocks all calls from other threads to static members of the class (Shared members in Visual Basic) until the class constructor has finished running.
For example, if a class constructor starts a new thread and the thread procedure calls a static member of the class, the new thread blocks until the class constructor completes.
This applies to any type that can have a static constructor.
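To illustrate (this example is not from the original text; the type and member names are hypothetical), the following sketch starts a thread from a static constructor. The worker's call to the static member Report does not execute until the static constructor has returned.

```csharp
using System;
using System.Threading;

static class Initialized
{
    static Initialized()
    {
        // Start a worker thread from the static constructor.
        var worker = new Thread(() =>
        {
            // This call to a static member blocks until the static
            // constructor has finished running.
            Report();
        });
        worker.Start();

        // Simulate initialization work; the worker cannot enter Report()
        // until this constructor returns.
        Thread.Sleep(500);
        Console.WriteLine("Static constructor finished.");
    }

    public static void Report() => Console.WriteLine("Static member called.");

    public static void Touch() { } // Forces type initialization when first called.
}

class Program
{
    static void Main()
    {
        Initialized.Touch();
        Thread.Sleep(1000); // Give the worker time to run after initialization.
    }
}
```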
General suggestions
When using multithreading, consider the following principles:
Do not use Thread.Abort to terminate other threads. Calling Abort on another thread is akin to throwing an exception on that thread without knowing what point the thread has reached in its processing.
Do not use Thread.Suspend and Thread.Resume to synchronize the activities of multiple threads. Use Mutex, ManualResetEvent, AutoResetEvent, and Monitor instead.
Do not control the execution of worker threads from your main program (using events, for example). Instead, design your program so that worker threads are responsible for waiting until work is available, executing it, and notifying other parts of the program when they finish. If your worker threads do not block, consider using thread pool threads. Monitor.PulseAll is useful in situations where worker threads block.
Do not use types as lock objects. That is, avoid code such as lock(typeof(X)) in C# or SyncLock(GetType(X)) in Visual Basic, or the use of Monitor.Enter with Type objects. For a given type, there is only one instance of System.Type per application domain. If the type you take a lock on is public, code other than your own can take locks on it, leading to deadlocks. For more information, see Reliability Best Practices.
Use caution when locking on instances, for example lock(this) in C# or SyncLock(Me) in Visual Basic. If other code in your application, external to the type, takes a lock on the object, deadlocks could occur.
Make sure that a thread that has entered a monitor always leaves that monitor, even if an exception occurs while the thread is in the monitor. The C# lock statement and the Visual Basic SyncLock statement provide this behavior automatically, using a finally block to ensure that Monitor.Exit is called. If you cannot ensure that Exit is called, consider changing your design to use Mutex. A mutex is released automatically when the thread that currently owns it terminates.
Use multiple threads for tasks that require different resources, and avoid assigning multiple threads to a single resource. For example, any task involving I/O benefits from having its own thread, because that thread blocks during I/O operations and thus allows other threads to execute. User input is another resource that benefits from a dedicated thread. On a single-processor computer, a computation-intensive task coexists with user input and with tasks that involve I/O, but multiple computation-intensive tasks compete with one another.
For simple state changes, consider using methods of the Interlocked class instead of the lock statement (SyncLock in Visual Basic). The lock statement is a good general-purpose tool, but the Interlocked class provides better performance for updates that must be atomic; internally, it executes a single lock prefix if there is no contention. In code reviews, watch for code like that shown in the following example, in which a state variable is incremented.
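The examples referred to above were not reproduced in this text. A minimal sketch (with hypothetical names myField and lockObject) of the pattern to look for, and its Interlocked replacement:

```csharp
using System.Threading;

class StateHolder
{
    private readonly object lockObject = new object();
    private int myField;

    // Pattern to watch for in code reviews: a full lock taken
    // only to increment a field.
    public void IncrementWithLock()
    {
        lock (lockObject)
        {
            myField++;
        }
    }

    // Simpler and faster for an atomic update.
    public void IncrementWithInterlocked()
    {
        Interlocked.Increment(ref myField);
    }
}
```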
Recommendations for class libraries
When designing a class library for multithreading, consider the following guidelines:
Avoid the need for synchronization, if possible. This is especially true for heavily used code. For example, an algorithm might be adjusted to tolerate a race condition rather than eliminate it. Unnecessary synchronization decreases performance and creates the possibility of deadlocks and race conditions.
Make static data (Shared in Visual Basic) thread safe by default.
Do not make instance data thread safe by default. Adding locks to create thread-safe code decreases performance, increases lock contention, and creates the possibility of deadlocks. In common application models, only one thread at a time executes user code, which minimizes the need for thread safety. For this reason, the .NET Framework class libraries are not thread safe by default.
Avoid providing static methods that alter static state. In common server scenarios, static state is shared across requests, which means multiple threads can execute that code at the same time, opening up the possibility of threading bugs. Consider using a design pattern that encapsulates data into instances that are not shared across requests, as the sketch after this list illustrates. In addition, if static data is synchronized, calls between static methods that alter state can result in deadlocks or redundant synchronization, adversely affecting performance.
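As a brief illustration of these guidelines (this sketch is not from the original text and the type is hypothetical), the class below guards its static state with a private lock, leaves its instance state unsynchronized, and keeps per-request data in instances rather than in static fields:

```csharp
using System.Collections.Generic;

// Hypothetical library type illustrating the guidelines above.
public class RequestProcessor
{
    // Static state is shared across all threads and requests, so guard it.
    private static readonly object s_lock = new object();
    private static int s_totalProcessed;

    public static int TotalProcessed
    {
        get { lock (s_lock) { return s_totalProcessed; } }
    }

    // Instance state is not synchronized by default; callers that share an
    // instance across threads are responsible for their own locking.
    private readonly List<string> _items = new List<string>();

    public void Add(string item) => _items.Add(item);

    public int Process()
    {
        int count = _items.Count;

        // Mutate static state only under the lock, and keep per-request
        // data in the instance rather than in static fields.
        lock (s_lock)
        {
            s_totalProcessed += count;
        }
        return count;
    }
}
```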