Thread Pool (ThreadPool)

Tags: APM, message queue

Original article: Thread Pool (ThreadPool) — https://www.cnblogs.com/jonins/p/9369927.html

Thread Pool Overview
A thread pool is a container of threads maintained by the system and shared by all AppDomains controlled by the CLR. The thread pool can be used to execute tasks, post work items, process asynchronous I/O, wait on behalf of other threads, and process timers.

Thread pool and Threads
Performance: Every newly created thread consumes memory and other resources (about 1 MB of memory by default), and in multithreaded scenarios the operating system must schedule the runnable threads and perform context switches, so too many threads also hurt performance. The purpose of the thread pool is to reduce the cost of creating new threads: an idle pooled thread is reused instead of a new thread being created, and threads are managed centrally (when a pooled thread finishes executing, it returns to the pool and waits for a new task).

Time: Whenever a thread is started, it takes time to create its stack and other per-thread state. The thread pool pre-creates a set of recyclable threads, reducing this startup cost.
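The reuse described above can be observed directly. The following is a minimal sketch (the class and method names are ours, and the exact counts vary by machine and runtime): many short work items are queued, yet only a handful of distinct pool threads service them.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

class PoolReuseDemo
{
    // Queue many short work items and count how many distinct pool threads
    // actually service them; reuse keeps this well below the item count.
    public static int CountDistinctPoolThreads(int workItems)
    {
        var ids = new ConcurrentDictionary<int, bool>();
        using (var done = new CountdownEvent(workItems))
        {
            for (int i = 0; i < workItems; i++)
            {
                ThreadPool.QueueUserWorkItem(_ =>
                {
                    ids.TryAdd(Thread.CurrentThread.ManagedThreadId, true);
                    done.Signal();
                });
            }
            done.Wait(); // block until every work item has run
        }
        return ids.Count;
    }

    static void Main()
    {
        Console.WriteLine("20 work items ran on {0} distinct pool threads",
                          CountDistinctPoolThreads(20));
    }
}
```

On a typical machine the distinct-thread count stays close to the processor count, far below 20, illustrating that pooled threads return to the pool and are handed new work rather than being destroyed.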

Thread pool disadvantages: the thread pool lowers the performance cost relative to raw threads (by sharing and recycling them), but:

1. The thread pool does not support interactive operations such as cancellation, completion, failure notification, and so on.

2. The thread pool does not support ordering the execution of its threads.

3. A pooled thread's name cannot be set, which makes the code harder to debug.

4. Pooled threads are usually background threads with a priority of ThreadPriority.Normal.

5. Blocking a pooled thread can hurt performance: blocking causes the CLR to assume, mistakenly, that a large amount of CPU is being consumed. The CLR can detect this and compensate (by injecting more threads into the pool), but that may make the thread pool more susceptible to subsequent overload. Task solves this problem.

6. The thread pool uses a global queue, and pooled threads still compete for this shared queue, which can affect performance (Task addresses this scenario by using local queues).
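Point 1 above notes that the raw thread pool has no cancellation support, and that Task layers it on top. A minimal sketch of cooperative cancellation with Task and CancellationToken (the class and method names here are ours, for illustration only):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class CancellationDemo
{
    // The raw thread pool cannot cancel a queued work item; Task adds
    // cancellation via CancellationToken, which the work checks cooperatively.
    public static string RunCancellable()
    {
        using (var cts = new CancellationTokenSource())
        {
            var task = Task.Run(() =>
            {
                for (int i = 0; ; i++)
                {
                    // Cooperative check: throws OperationCanceledException
                    // once cancellation has been requested.
                    cts.Token.ThrowIfCancellationRequested();
                    Thread.Sleep(10);
                }
            }, cts.Token);

            cts.CancelAfter(100); // request cancellation after ~100 ms

            try { task.Wait(); }
            catch (AggregateException ex) when (ex.InnerException is OperationCanceledException)
            {
                return "canceled"; // the task transitioned to the Canceled state
            }
            return "completed";
        }
    }

    static void Main() => Console.WriteLine(RunCancellable()); // prints "canceled"
}
```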

How the Thread Pool Works
When the CLR initializes, the thread pool contains no threads. Internally, the thread pool maintains a queue of operation requests. When an application performs an asynchronous operation, a record entry is appended to this queue; the thread pool's code dequeues entries and dispatches each one to a thread pool thread. If the pool has no threads, a new thread is created. When a pooled thread finishes its work, it is not destroyed; instead it returns to the pool, enters an idle state, and waits for another request. No additional performance cost is incurred, because the thread does not destroy and recreate itself.

When the program sends multiple requests to the thread pool, the pool tries to service all of them with a single thread, creating additional threads only when requests arrive faster than the pooled threads can process them. The thread pool therefore does not need to create a large number of threads.

If you stop sending tasks to the thread pool, the idle threads in the pool wake up after a period of time and terminate themselves to release resources (different versions of the CLR define this timeout differently).

Worker Threads & I/O Threads
The thread pool lets tasks be scheduled across multiple CPU cores, so that multiple threads work concurrently, using system resources efficiently and increasing program throughput.

The CLR thread pool is divided between worker threads and I/O Threads:

Worker threads (workerThreads): responsible for managing the operation of CLR-internal objects and providing "computational power", so they are typically used for compute-bound operations.

I/O threads (completionPortThreads): used primarily for exchanging data with external systems (such as reading a file) and for dispatching callbacks from the IOCP.

Note: The thread pool pre-caches some worker threads because creating a new thread is comparatively expensive.

I/O Completion Port (IOCP)
An I/O completion port (IOCP) is an asynchronous I/O API (it can be thought of as a message queue) that provides a threading model for handling multiple asynchronous I/O requests and efficiently notifies the application of I/O events. The IOCP is maintained internally by the CLR. When an asynchronous I/O request completes, the device driver generates an I/O request packet (IRP) and queues it (first in, first out) onto the completion port. An I/O thread then extracts the completed IRP and invokes the previously registered delegate.

I/O threads & IOCP & IRP:

When an I/O operation is performed (whether synchronous or asynchronous), a Windows API method is called that transitions the current thread from user mode to kernel mode, and an I/O request packet (IRP) is generated and initialized containing, among other things, a file handle, an offset, and a byte[] array. The I/O operation passes the request packet to the kernel, and from the packet the Windows kernel determines which hardware device the operation targets. The operation then enters the device's own processing queue, which is maintained by that device's driver.

For a synchronous I/O operation, while the hardware device performs the I/O, the thread that issued the request goes to sleep "waiting", and is woken when the device finishes. Performance is therefore poor: with many requests there are many sleeping threads, wasting a great deal of resources.

For an asynchronous I/O operation (in .NET, asynchronous I/O starts with a BeginXxx method whose internal implementation is ThreadPool.BindHandle), a delegate must be passed in; it travels with the IRP to the device's driver, and the call returns as soon as Windows has delivered the I/O request packet to the device's processing queue. Meanwhile the CLR assigns the freed thread to continue with the next task. When the device finishes, it notifies the CLR through the IOCP, and on receiving the notification an I/O thread puts the delegate back into the CLR thread pool queue for execution.

So in most cases developers use worker threads, while I/O threads are invoked by the CLR (developers do not use them directly).

Basic Thread Pool & Worker Threads (ThreadPool)
.NET exposes the thread pool through the ThreadPool class, a static class defined in the System.Threading namespace and available since .NET 1.1.

Calling the QueueUserWorkItem method places an asynchronous compute-bound operation into the thread pool's queue; it adds a work item and optional state data to the queue of the thread pool.
Work item: a method, identified by the callBack parameter, that will be called by a thread pool thread. A state argument can be passed to the method (multiple parameters must be wrapped in an entity class).

public static bool QueueUserWorkItem(WaitCallback callBack);
public static bool QueueUserWorkItem(WaitCallback callBack, object state);
The following is an example of starting a worker thread through QueueUserWorkItem:

class Program
{
    static void Main(string[] args)
    {
        // Way one
        {
            ThreadPool.QueueUserWorkItem(n => Test("Test-ok"));
        }
        // Way two
        {
            WaitCallback waitCallback = new WaitCallback(Test);
            ThreadPool.QueueUserWorkItem(n => waitCallback("WaitCallback")); // same effect: ThreadPool.QueueUserWorkItem(waitCallback, "Test-ok");
        }
        // Way three
        {
            ParameterizedThreadStart parameterizedThreadStart = new ParameterizedThreadStart(Test);
            ThreadPool.QueueUserWorkItem(n => parameterizedThreadStart("ParameterizedThreadStart"));
        }
        // Way four
        {
            TimerCallback timerCallback = new TimerCallback(Test);
            ThreadPool.QueueUserWorkItem(n => timerCallback("TimerCallback"));
        }
        // Way five
        {
            Action<object> action = Test;
            ThreadPool.QueueUserWorkItem(n => action("Action"));
        }
        // Way six
        ThreadPool.QueueUserWorkItem(o =>
        {
            var msg = "Lambda";
            Console.WriteLine("Execute method: {0}", msg);
        });

        // ...

        Console.ReadKey();
    }
    static void Test(object o)
    {
        Console.WriteLine("Execute method: {0}", o);
    }
    /*
     * Author: Jonins
     * Source: http://www.cnblogs.com/jonins/
     */
}
The execution results are as follows:

The above shows several ways to use the thread pool. WaitCallback is essentially a delegate that takes a single object parameter and returns no value:

public delegate void WaitCallback(object state);
Therefore, any type that satisfies this signature can be passed as a parameter, as shown in the example code above.

Thread Pool Common methods
Several common methods of ThreadPool are as follows:

Method: description
QueueUserWorkItem: queues a method for execution on a worker thread in the thread pool.
GetMinThreads: retrieves the minimum number of threads the thread pool creates on demand as new requests arrive.
GetMaxThreads: the maximum number of available threads; all requests beyond this number remain queued until thread pool threads become idle.
GetAvailableThreads: the number of idle threads remaining.
SetMaxThreads: sets the maximum number of threads in the thread pool (requests beyond this value enter the queue).
SetMinThreads: sets the minimum number of threads the thread pool keeps ready.
Example code:

static void Main(string[] args)
{
    // Declare variables (worker thread count, I/O completion port thread count)
    int workerThreadsCount, completionPortThreadsCount;
    {
        ThreadPool.GetMinThreads(out workerThreadsCount, out completionPortThreadsCount);
        Console.WriteLine("Minimum worker threads: {0}, minimum IO threads: {1}", workerThreadsCount, completionPortThreadsCount);
    }
    {
        ThreadPool.GetMaxThreads(out workerThreadsCount, out completionPortThreadsCount);
        Console.WriteLine("Maximum worker threads: {0}, maximum IO threads: {1}", workerThreadsCount, completionPortThreadsCount);
    }
    ThreadPool.QueueUserWorkItem((o) =>
    {
        Console.WriteLine("Occupies one pooled thread");
    });
    {
        ThreadPool.GetAvailableThreads(out workerThreadsCount, out completionPortThreadsCount);
        Console.WriteLine("Remaining worker threads: {0}, remaining IO threads: {1}", workerThreadsCount, completionPortThreadsCount);
    }
    Console.ReadKey();
}
Results of execution:

Attention:

1. Threads have memory overhead, so creating too many threads that are not fully utilized wastes memory; this is why the thread pool limits the minimum number of threads it keeps.

2. The maximum number of threads in a thread pool is only an upper bound on how many threads the pool may create; in practice threads are created on demand.
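The table above lists SetMinThreads and SetMaxThreads, but the example only reads the values. A sketch of raising the minimum worker-thread count (the class and method names are ours; the target value, twice the processor count, is an illustrative assumption, and the setting is process-wide):

```csharp
using System;
using System.Threading;

class MinThreadsDemo
{
    // Raise the pool's minimum worker-thread count so a burst of queued work
    // is not throttled by the pool's gradual thread-injection heuristic.
    public static bool RaiseMinWorkerThreads()
    {
        int worker, io;
        ThreadPool.GetMinThreads(out worker, out io);
        Console.WriteLine("Before: min worker threads = {0}, min IO threads = {1}", worker, io);

        // SetMinThreads returns false if the request is rejected
        // (for example, if it exceeds the configured maximums).
        bool accepted = ThreadPool.SetMinThreads(Math.Max(worker, Environment.ProcessorCount * 2), io);

        ThreadPool.GetMinThreads(out worker, out io);
        Console.WriteLine("After:  min worker threads = {0}, min IO threads = {1}", worker, io);
        return accepted;
    }

    static void Main()
    {
        Console.WriteLine("SetMinThreads accepted: " + RaiseMinWorkerThreads());
    }
}
```

Raising the minimum trades memory for responsiveness, which is consistent with note 1 above: keep it only as high as the workload actually needs.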

I/O threads
I/O threads were introduced by .NET for accessing external resources, to prevent the main thread from being blocked for long periods while doing so. .NET provides asynchronous methods for many I/O operations. For example:

FileStream: BeginRead, BeginWrite. Calling BeginRead/BeginWrite initiates an asynchronous operation. However, true IOCP support is only obtained if FileOptions.Asynchronous is passed when creating the FileStream; otherwise the BeginXxx methods use the default implementation defined on the Stream base class. That implementation starts the asynchronous call with a delegate's BeginInvoke method, which uses an extra thread to perform the task (no IOCP support, and possibly extra performance cost).

Dns: BeginGetHostByName, BeginResolve.

Socket: BeginAccept, BeginConnect, BeginReceive, and so on.

WebRequest: BeginGetRequestStream, BeginGetResponse.

SqlCommand: BeginExecuteReader, BeginExecuteNonQuery, and so on. This is probably the most common asynchronous operation when developing web applications. To get IOCP support for database operations, you must set Asynchronous Processing=true in the connection string (the default is false); otherwise an exception is thrown when a BeginXxx method is invoked.

WebService: for example, the BeginXxx methods on web service proxies generated by .NET 2.0 or WCF, and the InvokeAsync method of ClientBase in WCF.

These asynchronous methods follow the same pattern: they start with a BeginXxx method (internally implemented via ThreadPool.BindHandle) and end with an EndXxx method.

Attention:

1. With APM you must call EndXxx to complete the asynchronous operation; failing to do so can leak resources.

2. A delegate's BeginInvoke method does not get IOCP support.

3. IOCP itself does not occupy threads.

The following is an example of using WebRequest to invoke an I/O thread for an asynchronous API:

class Program
{
    static void Main(string[] args)
    {
        int workerThreadsCount, completionPortThreadsCount;
        ThreadPool.GetAvailableThreads(out workerThreadsCount, out completionPortThreadsCount);
        Console.WriteLine("Remaining worker threads: {0}, remaining IO threads: {1}", workerThreadsCount, completionPortThreadsCount);
        // Calling the async API of the WebRequest class occupies an I/O thread
        {
            WebRequest webRequest = WebRequest.Create("http://www.cnblogs.com/jonins");
            webRequest.BeginGetResponse(result =>
            {
                Thread.Sleep(2000); // keep the callback busy briefly so it shows up in the counts
                Console.WriteLine(Thread.CurrentThread.ManagedThreadId + ": callback performs the final response");
                WebResponse webResponse = webRequest.EndGetResponse(result);
            }, null);
        }
        Thread.Sleep(1000);
        ThreadPool.GetAvailableThreads(out workerThreadsCount, out completionPortThreadsCount);
        Console.WriteLine("Remaining worker threads: {0}, remaining IO threads: {1}", workerThreadsCount, completionPortThreadsCount);
        Console.ReadKey();
    }
}
The execution results are as follows:

This concludes the content on I/O threads; the remaining details are more about I/O operations and file handling than about the thread pool itself.

Execution context
Every thread has an execution context data structure associated with it. The execution context includes:

1. Security settings (compressed stack, the thread's Principal property, Windows identity).

2. Host settings (System.Threading.HostExecutionContextManager).

3. Logical call context data (the LogicalGetData and LogicalSetData methods of System.Runtime.Remoting.Messaging.CallContext).

When a thread executes code, some operations are affected by the settings in the thread's execution context, especially the security settings.

When the main thread uses a worker thread to perform a task, the former's execution context "flows to" (is copied to) the worker thread, which ensures that any operations performed by the worker thread use the same security settings and host settings.

By default, the CLR automatically makes the initiating thread's execution context "flow" to any worker thread. However, this has a performance cost: the information contained in the execution context takes time to capture and copy to the worker thread, and if the worker thread in turn uses more worker threads, more execution context data structures must be created and initialized.

The ExecutionContext class in the System.Threading namespace allows you to control the flow of a thread's execution context:

class Program
{
    static void Main(string[] args)
    {
        // Put some data into the Main thread's logical call context
        CallContext.LogicalSetData("Action", "jonins");
        // Have a thread pool thread do some work; it can access the logical call context data
        ThreadPool.QueueUserWorkItem(state => Console.WriteLine("Worker thread A: " + Thread.CurrentThread.ManagedThreadId + "; Action={0}", CallContext.LogicalGetData("Action")));
        // Now suppress the flow of the main thread's execution context
        ExecutionContext.SuppressFlow();
        // Have a thread pool thread do some work; this time it cannot see the logical call context data
        ThreadPool.QueueUserWorkItem(state => Console.WriteLine("Worker thread B: " + Thread.CurrentThread.ManagedThreadId + "; Action={0}", CallContext.LogicalGetData("Action")));
        // Restore the main thread's execution context flow for any later use of thread pool threads
        ExecutionContext.RestoreFlow();
        Console.ReadKey();
    }
}
The results are as follows:

The ExecutionContext class blocks context flow to boost program performance; for server applications the gain can be significant, while client applications gain little. In addition, because the SuppressFlow method is marked with the [SecurityCritical] attribute, it cannot be called in some clients, such as Silverlight.

Attention:

1. A worker thread should have execution context flow suppressed when it does not need to access the context information.

2. Knowledge of execution context flow is also useful when working with Task objects and when initiating asynchronous I/O operations.

Three Asynchronous Patterns (a primer) & BackgroundWorker
1. APM & EAP & TAP
.NET supports three asynchronous programming patterns: APM, EAP, and TAP.

1. Event-based Asynchronous Pattern (EAP)

Code written to the EAP pattern has the following naming characteristics:

1. There are one or more methods named [Xxx]Async. These may mirror synchronous versions that perform the same operation on the current thread.
2. The class may also have an [Xxx]Completed event that listens for the result of the asynchronous method.
3. It may have an [Xxx]AsyncCancel (or simply CancelAsync) method that cancels an asynchronous operation in progress.
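The naming characteristics above can be sketched with a small hand-written EAP-style class (EapCalculator, CalculateAsync, and CalculateCompleted are hypothetical names of ours, not BCL types; only RunWorkerCompletedEventArgs comes from the framework):

```csharp
using System;
using System.ComponentModel;
using System.Threading;

// A hypothetical EAP-style class following the naming pattern above:
// CalculateAsync starts the work, CalculateCompleted reports the result.
class EapCalculator
{
    public event EventHandler<RunWorkerCompletedEventArgs> CalculateCompleted;

    public void CalculateAsync(int n)
    {
        // EAP classes typically run the work on a thread pool thread.
        ThreadPool.QueueUserWorkItem(_ =>
        {
            int result = n * n;
            var handler = CalculateCompleted;
            if (handler != null)
                handler(this, new RunWorkerCompletedEventArgs(result, null, false));
        });
    }
}

class Program
{
    static void Main()
    {
        var calc = new EapCalculator();
        using (var done = new ManualResetEvent(false))
        {
            calc.CalculateCompleted += (s, e) =>
            {
                Console.WriteLine("Result: " + e.Result); // prints "Result: 25"
                done.Set();
            };
            calc.CalculateAsync(5);
            done.WaitOne();
        }
    }
}
```

Real EAP types such as BackgroundWorker (discussed below) follow exactly this shape, adding progress reporting and CancelAsync on top.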

2. Asynchronous Programming Model (APM)

Code written to the APM pattern has the following naming characteristics:

1. An asynchronous operation that uses the IAsyncResult design pattern is implemented as two methods named BeginXxx and EndXxx, which begin and end the asynchronous operation respectively. For example, the FileStream class provides BeginRead and EndRead methods to read bytes from a file asynchronously.

2. After calling BeginXxx, the application can continue executing instructions on the calling thread while the asynchronous operation runs on another thread. For every call to BeginXxx, the application should also call EndXxx to obtain the operation's result.
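A minimal sketch of the APM pair on FileStream (the class and method names around the Begin/End calls are ours; FileOptions.Asynchronous requests true IOCP-backed I/O, as noted in the FileStream section above):

```csharp
using System;
using System.IO;
using System.Threading;

class ApmDemo
{
    // APM on FileStream: BeginRead starts the operation; EndRead is called
    // inside the callback to collect the result and release resources.
    public static int ReadLengthApm(string path)
    {
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read,
                                       FileShare.Read, 4096, FileOptions.Asynchronous))
        {
            var buffer = new byte[fs.Length];
            int bytesRead = 0;
            using (var done = new ManualResetEvent(false))
            {
                fs.BeginRead(buffer, 0, buffer.Length, ar =>
                {
                    bytesRead = fs.EndRead(ar); // every BeginRead must be paired with EndRead
                    done.Set();
                }, null);
                done.WaitOne(); // demo only: a real caller would keep working here
            }
            return bytesRead;
        }
    }

    static void Main()
    {
        var path = Path.GetTempFileName(); // scratch file just for the demo
        File.WriteAllText(path, "hello apm");
        Console.WriteLine(ReadLengthApm(path)); // prints 9
        File.Delete(path);
    }
}
```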

3. Task-based Asynchronous Pattern (TAP)

Based on Task and Task&lt;TResult&gt; in the System.Threading.Tasks namespace, which represent arbitrary asynchronous operations. TAP is discussed later; for a detailed description of the three asynchronous patterns, see the references.
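For contrast with the Begin/End pair above, a minimal TAP sketch (the class and method names are ours): a single method returning Task&lt;TResult&gt; represents the whole asynchronous operation, and await replaces the callback.

```csharp
using System;
using System.Threading.Tasks;

class TapDemo
{
    // TAP: one method returning Task<TResult> stands for the entire
    // asynchronous operation; no separate End method is needed.
    public static async Task<int> SumSquaresAsync(int n)
    {
        return await Task.Run(() =>
        {
            int sum = 0;
            for (int i = 1; i <= n; i++) sum += i * i; // compute-bound work on a pool thread
            return sum;
        });
    }

    static void Main()
    {
        Console.WriteLine(SumSquaresAsync(3).Result); // prints 14 (1 + 4 + 9)
    }
}
```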

2. BackgroundWorker
BackgroundWorker essentially uses a worker thread from the thread pool, though the class itself is somewhat redundant today (just be aware of it). You attach a custom method to BackgroundWorker's DoWork event, then run it on a pooled thread by calling RunWorkerAsync.

DoWork is essentially an event; its delegate type has no return value and two parameters of types object and DoWorkEventArgs.

public event DoWorkEventHandler DoWork;

public delegate void DoWorkEventHandler(object sender, DoWorkEventArgs e);
Examples are as follows:

class Program
{
    static void Main(string[] args)
    {
        int workerThreadsCount, completionPortThreadsCount;
        ThreadPool.GetAvailableThreads(out workerThreadsCount, out completionPortThreadsCount);
        Console.WriteLine("Remaining worker threads: {0}, remaining IO threads: {1}", workerThreadsCount, completionPortThreadsCount);
        {
            BackgroundWorker backgroundWorker = new BackgroundWorker();
            backgroundWorker.DoWork += DoWork;
            backgroundWorker.RunWorkerAsync();
        }
        Thread.Sleep(1000);
        ThreadPool.GetAvailableThreads(out workerThreadsCount, out completionPortThreadsCount);
        Console.WriteLine("Remaining worker threads: {0}, remaining IO threads: {1}", workerThreadsCount, completionPortThreadsCount);
        Console.ReadKey();
    }
    private static void DoWork(object sender, DoWorkEventArgs e)
    {
        Thread.Sleep(2000);
        Console.WriteLine("demo-ok");
    }
}
A pooled thread is occupied internally; the execution results are as follows:

Conclusion
In practice, programmers mostly write code that uses the thread pool's worker threads.

Compared with standalone threads (Thread), the thread pool (ThreadPool) ensures that a temporary overload of compute-intensive work does not oversubscribe the CPU (more active threads than CPU cores, forcing the operating system to time-slice threads).

Such overload hurts performance because time-slicing incurs substantial context-switching overhead and invalidates the CPU caches, overhead that becomes unavoidable once the processor is over-committed.

The CLR prevents thread pool overload by queueing tasks and limiting how many tasks start: it first runs as many concurrent tasks as there are hardware cores, then uses a hill-climbing algorithm to adjust concurrency so the program stays near its optimal performance curve.

Reference documents
CLR via C# (4th Edition), Jeffrey Richter

Professional C# 6 and .NET Core 1.0 (C# Advanced Programming, 10th Edition), Christian Nagel

C# 5.0 in a Nutshell, Joseph Albahari

http://www.cnblogs.com/dctit/

http://www.cnblogs.com/kissdodog/

http://www.cnblogs.com/JeffreyZhao/

...
