Recently, an overall upgrade of our infrastructure required updating the DLLs of all related projects. Because this carried no small risk, the production servers were closely monitored after release. The result: a few applications threw exceptions and quickly consumed large amounts of server memory and CPU. Studying the dump files suggested that a single point of failure in the configuration server initially triggered exceptions when the application was called by multi-threaded SOA services, raising ThreadAbortException. Because the original exception-handling code was not rigorous and was tightly coupled with the asynchronous sending of alarm messages, the number of threads grew geometrically, eventually making the entire server unusable. This account is not entirely conclusive; the suspected causes are plausible but the evidence is insufficient. Fortunately, after refactoring the code to remove the multi-threaded operations, the service returned to normal. In any case, it was a good occasion to study multithreading in the .NET CLR. A senior architect on our team suggested not creating threads directly unless you genuinely need to control the number of threads and keep them traceable. Moreover, when the CLR is hosted in IIS, the thread pool has many limitations and is shared by all AppDomains in the CLR, making it prone to unexpected errors; he recommended using .NET's newer asynchronous model, the TPL.
In the CLR via C# book, threading-related content is divided into five parts: thread fundamentals, compute-bound asynchronous operations, I/O-bound asynchronous operations, primitive thread synchronization constructs, and hybrid thread synchronization constructs. Although this article does not follow that classification, it is helpful for organizing the related concepts into a coherent whole in your mind.
A process is a basic concept in the operating system: it contains all the resources needed to run a program. Processes are independent of each other and have their own memory areas; a process can be considered the basic unit of an independently running program. By design, Windows ensures the robustness of programs by giving each process its own virtual address space, guaranteeing that one process cannot access the code of another. Windows is a preemptive multithreaded operating system that uses time slices to arbitrate CPU contention among processes (threads).
An application domain (AppDomain) is a logical region in which a program runs; .NET assemblies run inside application domains, and one process can contain multiple application domains.
A thread is the basic unit of execution within a process, and the first thread executed at the process entry point is considered the process's main thread. In .NET applications, the Main() method is the entry point; when it is called, the system automatically creates the main thread. A thread mainly consists of CPU registers, a call stack, and thread-local storage (TLS). The CPU registers record the state of the currently executing thread, the call stack maintains the memory and data of the thread's method calls, and TLS holds the thread's state information. A thread can be seen as a virtualization of the CPU, and consists of five main elements:
- A thread kernel object, which contains a set of properties describing the thread as well as the thread context;
- The thread environment block (TEB), which contains the head of the thread's exception-handling chain; each try block the thread enters inserts a node at that head, which is part of why the special ThreadAbortException is raised again at the end of each catch block;
- The user-mode stack, which stores local variables and arguments passed to methods; its default reserved size is 1 MB, the largest part of a thread's memory footprint;
- The kernel-mode stack, which is used when calling kernel-mode APIs;
- DLL thread-attach and thread-detach notifications: whenever Windows creates a thread, it calls the entry point of every loaded DLL with a DLL_THREAD_ATTACH notification; when many DLLs are loaded, this operation can cause a significant performance drain.
In addition, when the CLR performs garbage collection, it must suspend all threads and walk their stacks to mark objects in the heap, so a large number of threads has a serious impact on garbage-collection performance; creating and reclaiming threads also consumes resources, so they must be used very carefully. Of course, parallel computing on multicore machines is genuinely attractive. Windows defines 32 thread priority levels, but we usually use a simplified set of 5 priorities, with Normal as the actual default.
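As a small illustration of the elements above, the following minimal sketch (a hypothetical example, not from the original article) creates a thread with a smaller user-mode stack reservation than the 1 MB default and a non-default priority:

```csharp
using System;
using System.Threading;

class ThreadAnatomyDemo
{
    static void Main()
    {
        // The second constructor argument overrides the default 1 MB
        // user-mode stack reservation (here: 256 KB).
        var worker = new Thread(() =>
        {
            Console.WriteLine(Thread.CurrentThread.Priority); // prints "BelowNormal"
        }, 256 * 1024);

        // Of the simplified 5 priority levels, Normal is the default;
        // here we lower it explicitly.
        worker.Priority = ThreadPriority.BelowNormal;
        worker.Start();
        worker.Join(); // wait until the worker finishes
    }
}
```

Shrinking the stack reservation like this is only worthwhile when an application really needs very many threads; otherwise the default is fine.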
- System.Threading.Thread class
System.Threading.Thread is the base class for controlling threads; it can be used to create, suspend, stop, and destroy threads in the current application domain.
It includes the following common properties:

| Property | Explanation |
| --- | --- |
| CurrentContext | Gets the current context in which the thread is executing. |
| CurrentThread | Gets the currently running thread. |
| ExecutionContext | Gets an ExecutionContext object that contains information about the various contexts of the current thread. |
| IsAlive | Gets a value that indicates the execution state of the current thread. |
| IsBackground | Gets or sets a value that indicates whether a thread is a background thread. |
| IsThreadPoolThread | Gets a value that indicates whether the thread belongs to the managed thread pool. |
| ManagedThreadId | Gets the unique identifier of the current managed thread. |
| Name | Gets or sets the name of the thread. |
| Priority | Gets or sets a value that indicates the scheduling priority of the thread. |
| ThreadState | Gets a value containing the states of the current thread. |
An application domain may contain multiple contexts, and the thread's current context can be obtained through CurrentContext. CurrentThread is the most commonly used property: it returns the currently running thread.
Through ThreadState you can detect whether a thread is in the Unstarted, WaitSleepJoin (sleeping), Running, or another state; it provides more specific information than the IsAlive property. You can change the state of a thread in the following ways:
- Suspending a thread: Sleep() suspends the current thread for a specified time, while Suspend() keeps a thread suspended until it is resumed; use Suspend/Resume combinations carefully. If a thread holds a resource and Suspend() keeps it suspended for a long time, other threads that need that resource will deadlock. Therefore, avoid these two methods unless truly necessary. Also, when you cannot predict how long an asynchronous thread needs to run, blocking the main thread with Thread.Sleep(int) is a poor solution; instead use Thread.Join(), which guarantees that the main thread does not continue until the asynchronous thread has finished running.
- Terminating a thread: use the Abort() method to terminate a running thread. When Abort() is called, the special exception ThreadAbortException is thrown on the target thread. If you want the thread to keep executing rather than terminate, you can call Thread.ResetAbort() inside the catch (ThreadAbortException ex) { ... } block after catching the exception, canceling the abort. Using Thread.Join() ensures that the application domain waits for the asynchronous thread to end before terminating.
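A minimal sketch of the Join() advice above (a hypothetical example; the sleep duration is arbitrary and only simulates work):

```csharp
using System;
using System.Threading;

class JoinDemo
{
    static void Main()
    {
        bool done = false;
        var worker = new Thread(() =>
        {
            Thread.Sleep(100); // simulate work of unpredictable length
            done = true;
        });

        Console.WriteLine(worker.ThreadState); // prints "Unstarted"
        worker.Start();
        worker.Join();                         // block until the worker really finishes
        Console.WriteLine(done);               // prints "True" -- guaranteed by Join
        Console.WriteLine(worker.IsAlive);     // prints "False"
    }
}
```

Unlike guessing a wait time with Thread.Sleep(int) on the main thread, Join() works regardless of how long the worker actually takes.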
- ThreadStart and ParameterizedThreadStart delegate classes
Creating a new thread through ThreadStart is the most straightforward approach, but such threads are harder to manage, and creating too many of them slows down the system (through excessive thread context switching), so use them with caution.
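The two delegate types can be sketched as follows (a hypothetical minimal example; the printed strings are arbitrary):

```csharp
using System;
using System.Threading;

class StartDemo
{
    static void Main()
    {
        // ThreadStart: a delegate that takes no argument.
        var t1 = new Thread(new ThreadStart(
            () => Console.WriteLine("no argument")));

        // ParameterizedThreadStart: takes one object argument,
        // supplied through Start(object).
        var t2 = new Thread(new ParameterizedThreadStart(
            state => Console.WriteLine("got: " + state)));

        t1.Start();
        t2.Start("hello");
        t1.Join();
        t2.Join();
    }
}
```

The print order of the two threads is not deterministic, which is itself a small demonstration of why directly created threads are harder to manage.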
When the CLR initializes, there are no threads in the thread pool. The pool maintains an internal queue of operation requests; when an application wants to perform an asynchronous operation, it calls a method that appends an entry to this queue. The thread-pool code extracts entries from the queue and dispatches them to threads. If a new thread has to be created, it is not destroyed after its task completes; instead it returns to the thread pool in a suspended state, and is reactivated to execute another task the next time the application makes a request. This avoids the performance cost of repeatedly creating threads, and lets many tasks reuse the same threads over and over, saving significant overhead during the lifetime of the application.
The thread pool divides its threads into worker threads and I/O (completion-port) threads. Worker threads are mainly used to run work on internal CLR objects, while I/O threads are used to exchange information with external systems. The commonly used ThreadPool methods are as follows:
| Method | Explanation |
| --- | --- |
| QueueUserWorkItem(WaitCallback callback, object state) | Adds a work item to the thread pool queue; the first parameter is the callback delegate, the second is the argument passed to it. |
| GetMaxThreads(out int workerThreads, out int completionPortThreads) | Gets the maximum number of worker and I/O threads. |
| SetMaxThreads(int workerThreads, int completionPortThreads) | Sets the maximum number of worker and I/O threads. |
Through the GetMaxThreads/SetMaxThreads pair you can read and set the maximum number of worker threads and I/O threads in the CLR thread pool. In .NET Framework 4.0 the default maximum is 250 threads per CPU, which usually works out to around 1000.
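Putting the methods in the table together, a minimal sketch (a hypothetical example; the ManualResetEvent is only there because pool threads are background threads and would otherwise be killed when Main exits):

```csharp
using System;
using System.Threading;

class PoolDemo
{
    static void Main()
    {
        // Read the pool's current maximums (values vary by machine/runtime).
        ThreadPool.GetMaxThreads(out int workers, out int completionPorts);
        Console.WriteLine($"max workers: {workers}, max I/O threads: {completionPorts}");

        using (var finished = new ManualResetEvent(false))
        {
            // The second argument to QueueUserWorkItem arrives as the
            // WaitCallback's single object parameter.
            ThreadPool.QueueUserWorkItem(state =>
            {
                Console.WriteLine(Thread.CurrentThread.IsThreadPoolThread); // prints "True"
                Console.WriteLine("payload: " + state);
                finished.Set();
            }, "some payload");

            finished.WaitOne(); // wait for the queued work item to complete
        }
    }
}
```

Note that SetMaxThreads is deliberately not called here; as the notes below explain, changing the pool size affects all code sharing the pool and is often restricted by hosts such as IIS.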
Notes on using the thread pool:
- Threads created by the CLR thread pool are background threads by default, with a priority of ThreadPriority.Normal.
- You cannot set the number of worker threads or I/O completion threads to less than the number of processors on the machine.
- If the common language runtime is hosted, for example by IIS or SQL Server, the host may limit or disallow changes to the thread pool size.
- Be cautious about changing the maximum number of threads in a thread pool: while such a change may benefit your code, it may harm other code that shares the pool.
- Setting the thread pool size too large can cause performance problems: if too many threads execute at the same time, task-switching overhead becomes a major factor affecting performance.
- ThreadAbortException
When the Abort method is called to destroy a thread, the common language runtime throws ThreadAbortException. ThreadAbortException is a special exception: it can be caught, but it is automatically raised again at the end of the catch block. When this exception is thrown, the runtime executes all finally blocks before ending the thread. Because the thread can perform unbounded computation in a finally block, or call Thread.ResetAbort to cancel the abort, there is no guarantee that the thread will ever end. If you want to wait until the aborted thread has ended, call the Thread.Join method; Join is a blocking call that does not return until the thread actually stops executing.
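The behavior above can be sketched as follows (a hypothetical example with arbitrary sleep durations). Note that it relies on Thread.Abort/ResetAbort, which only work on the classic .NET Framework; on .NET Core and .NET 5+, Abort throws PlatformNotSupportedException:

```csharp
using System;
using System.Threading;

class AbortDemo
{
    static void Main()
    {
        var worker = new Thread(() =>
        {
            try
            {
                while (true) { Thread.Sleep(10); } // pretend to work forever
            }
            catch (ThreadAbortException)
            {
                Console.WriteLine("caught ThreadAbortException");
                // Without ResetAbort, the CLR re-raises the exception at the
                // end of this catch block; with it, the thread keeps running.
                Thread.ResetAbort();
            }
            Console.WriteLine("worker survived the abort");
        });

        worker.Start();
        Thread.Sleep(50);  // give the worker time to enter its loop
        worker.Abort();
        worker.Join();     // block until the worker has actually finished
    }
}
```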
- Thread execution context
Each thread is associated with an execution context data structure, which includes security settings (principal properties and Windows identity), host settings (HostExecutionContextManager), and the logical call context (CallContext, with its LogicalSetData and LogicalGetData methods). We can suppress the flow of the thread's context to reduce resource costs; the simple example below illustrates this.
```csharp
public void Test()
{
    CallContext.LogicalSetData("name", "Xionger");

    // The context flows to the pool thread, so "Xionger" is printed.
    ThreadPool.QueueUserWorkItem(s =>
        Console.WriteLine("Name: {0}", CallContext.LogicalGetData("name")));

    // Block the flow of the thread's execution context.
    ExecutionContext.SuppressFlow();
    ThreadPool.QueueUserWorkItem(s =>
        Console.WriteLine("Name: {0}", CallContext.LogicalGetData("name")));

    // Resume the flow of the thread's execution context.
    ExecutionContext.RestoreFlow();
}
```
- Completion port model (a fairly old Win32 concept that can be skipped)
Earlier we saw that I/O threads are called completion-port threads; the name comes from an asynchronous I/O model under Windows. A completion port can be viewed as a queue maintained by the system: the operating system places notifications of completed overlapped I/O operations into this queue, hence the name "completion port", since what it exposes are "operation complete" event notifications. Once a socket is created, it can be associated with a completion port at any time.
In general, an application can create multiple worker threads to handle notification events on the completion port; how many depends on the program's specific needs. Ideally, though, you should create one thread per CPU: in the ideal completion-port model, each thread gets an "atomic" time slice from the system, running and checking the completion port in turn, and thread switching is pure extra overhead. In actual development you must also consider whether those threads perform other blocking operations. If a thread blocks, the system suspends it and lets other threads run in the meantime; so in that situation you can create more threads to make full use of the CPU time.
In summary, developing a scalable Winsock server is not especially difficult: start a listening socket, accept connections, and issue overlapped send and receive I/O operations. The biggest challenge is managing system resources, limiting the number of outstanding overlapped I/O operations, and avoiding memory crises. Following these principles will help you develop high-performance, scalable service programs. As for the socket's receive buffer: while the only receive event comes from the AcceptEx call, keeping a receive buffer posted for every socket does no harm. But once the client and server move past the initial request (handled by AcceptEx) and begin exchanging more data, dropping the receive buffer is a bad idea, unless you can guarantee that each connection's receives are always performed through overlapped I/O.
Resources:
- Richter, Jeffrey. CLR via C# [M]. Beijing: Tsinghua University Press.
- Dust Prodigal. Elaborate Multithreading [EB/OL]. http://www.cnblogs.com/leslies2/archive/2012/02/07/2310495.html.
.NET Threading Control Quick Learning 01