C# multi-threaded automatic management (thread pool): a task-based approach

Source: Internet
Author: User


In multi-threaded programs, there are two common situations:
1. Application threads spend most of their time waiting for an event to occur and then respond to it. This is usually handled with the ThreadPool (thread pool).
2. Threads are normally dormant and are only woken up periodically. This is usually handled with a Timer (timer).

The ThreadPool class provides a system-maintained thread pool (which can be thought of as a container for threads). It requires Windows 2000 or later, because some of its methods call API functions that are only available in those newer versions of Windows.

To place a work item into the thread pool, use the ThreadPool.QueueUserWorkItem() method, whose prototypes are as follows:
Queues a work item; a pool thread invokes the method represented by the WaitCallback delegate
public static bool QueueUserWorkItem (WaitCallback);
The overload is as follows; the object argument is passed to the method represented by the WaitCallback delegate
public static bool QueueUserWorkItem (WaitCallback, object);
Note:
The ThreadPool class is a static class; you cannot, and need not, create an instance of it. Also, once a work item has been queued with this method, it cannot be cancelled. You do not have to create a thread yourself: just write the work to be done as a method, and pass it as a parameter to ThreadPool.QueueUserWorkItem() by way of a WaitCallback delegate. Thread creation, management, and execution are all handled automatically by the system, so you do not have to deal with those complex details.
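As a minimal sketch of the call described above (the DoWork method, the queued string, and the use of a ManualResetEvent to keep the process alive are illustrative choices, not part of the original sample):

```csharp
using System;
using System.Threading;

class QueueDemo
{
    // Signaled by the pool thread when the work item has run.
    static ManualResetEvent done = new ManualResetEvent(false);

    // The method a pool thread runs; its parameter is the second
    // argument passed to QueueUserWorkItem.
    static void DoWork(object state)
    {
        Console.WriteLine("working on: {0}", state);
        done.Set();
    }

    static void Main()
    {
        // Hand the work to the system-managed pool; no Thread object
        // is created by hand, and the item cannot be cancelled.
        ThreadPool.QueueUserWorkItem(new WaitCallback(DoWork), "item 1");

        // Wait until the pool thread signals completion.
        done.WaitOne();
        Console.WriteLine("main thread: work item finished");
    }
}
```

The event is only there so the main thread does not exit before the pool thread runs; the larger sample below uses the same pattern.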

Usage of ThreadPool:
First the program creates a ManualResetEvent object, which acts like a signal flag that can be used to notify other threads. In this example, when all the work items in the thread pool have finished, the ManualResetEvent object is set to the signaled state to tell the main thread to continue running.
A ManualResetEvent object has several important members:
When the object is constructed, the caller specifies its initial state (signaled or non-signaled); after construction, the object keeps that state until its Reset() or Set() method is called:
Reset ():
Sets it to the non-signaled state;
Set ():
Sets it to the signaled state.
WaitOne ():
Blocks the current thread until the ManualResetEvent object becomes signaled, at which point the thread resumes. The program then queues work items to the thread pool; the system automatically creates and initializes the pool threads and runs the supplied method on them. When all the work items have finished, ManualResetEvent.Set() is called; the main thread, blocked in ManualResetEvent.WaitOne(), receives the signal, resumes, and finishes its remaining work.

using System;
using System.Collections;
using System.Threading;

namespace ThreadExample
{
    /// <summary>
    /// This data structure holds the information passed to each work item as a parameter.
    /// </summary>
    public class SomeState
    {
        public int Cookie;
        public SomeState(int iCookie)
        {
            Cookie = iCookie;
        }
    }

    public class Alpha
    {
        public Hashtable HashCount;
        public ManualResetEvent EventX;
        public static int iCount = 0;
        public static int iMaxCount = 0;

        public Alpha(int MaxCount)
        {
            HashCount = new Hashtable(MaxCount);
            iMaxCount = MaxCount;
        }

        /// <summary>
        /// The thread pool calls the Beta() method for each queued work item.
        /// </summary>
        public void Beta(Object state)
        {
            // Show which pool thread is running and which cookie it received.
            Console.WriteLine("{0} {1}:", Thread.CurrentThread.GetHashCode(),
                ((SomeState)state).Cookie);

            lock (HashCount)
            {
                // Count how many work items each pool thread processes.
                if (!HashCount.ContainsKey(Thread.CurrentThread.GetHashCode()))
                    HashCount.Add(Thread.CurrentThread.GetHashCode(), 0);
                HashCount[Thread.CurrentThread.GetHashCode()] =
                    ((int)HashCount[Thread.CurrentThread.GetHashCode()]) + 1;
            }

            Thread.Sleep(2000);

            // Atomically increment the counter; when the last work item
            // finishes, signal the event so the main thread can continue.
            Interlocked.Increment(ref iCount);
            if (iCount == iMaxCount)
            {
                Console.WriteLine();
                EventX.Set();
            }
        }
    }

    public class SimplePool
    {
        public static void Main(string[] args)
        {
            Console.WriteLine("Thread Pool Sample:");
            bool W2K = false;
            int MaxCount = 10;

            // Created in the non-signaled state.
            ManualResetEvent eventX = new ManualResetEvent(false);
            Console.WriteLine("Queuing {0} items to Thread Pool", MaxCount);

            Alpha oAlpha = new Alpha(MaxCount);
            oAlpha.EventX = eventX;

            Console.WriteLine("Queue to Thread Pool 0");
            try
            {
                // Queue the first work item; on systems without thread pool
                // support this throws NotSupportedException.
                ThreadPool.QueueUserWorkItem(new WaitCallback(oAlpha.Beta), new SomeState(0));
                W2K = true;
            }
            catch (NotSupportedException)
            {
                Console.WriteLine("These APIs may fail when called on a non-Windows 2000 system.");
                W2K = false;
            }

            if (W2K)
            {
                for (int iItem = 1; iItem < MaxCount; iItem++)
                {
                    Console.WriteLine("Queue to Thread Pool {0}", iItem);
                    ThreadPool.QueueUserWorkItem(new WaitCallback(oAlpha.Beta), new SomeState(iItem));
                }

                Console.WriteLine("Waiting for Thread Pool to drain");
                // Block the main thread until eventX is signaled by the last work item.
                eventX.WaitOne(Timeout.Infinite, true);

                Console.WriteLine("Thread Pool has been drained (Event fired)");
                Console.WriteLine();
                Console.WriteLine("Load Across Threads");
                foreach (Object o in oAlpha.HashCount.Keys)
                {
                    Console.WriteLine("{0} {1}", o, oAlpha.HashCount[o]);
                }
            }
            Console.ReadLine();
        }
    }
}

Points in the program worth noting:
The SomeState class is a data structure that holds information passed to each work item as a parameter. Encapsulating the useful information in such an object is a very effective way to supply it to the thread.
The Interlocked class used in the program is also designed for multithreaded programs; it provides some useful atomic operations. Atomic operation: in a multithreaded program, if one thread uses such an operation to modify a variable, no other thread can modify that variable at the same moment. For a single operation, the effect is essentially the same as protecting it with the lock keyword.
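A small sketch of that point (the counter and the number of work items are made up for illustration): Interlocked.Increment makes the increment itself atomic, so no lock statement is needed around it, yet the final count is always exact:

```csharp
using System;
using System.Threading;

class InterlockedDemo
{
    static int counter = 0;

    static void Main()
    {
        ManualResetEvent done = new ManualResetEvent(false);
        int pending = 100;

        for (int i = 0; i < 100; i++)
        {
            ThreadPool.QueueUserWorkItem(delegate
            {
                // Atomic increment: no other thread can interleave with it,
                // so "read, add, write" cannot be torn apart.
                Interlocked.Increment(ref counter);

                // Atomic decrement doubles as a completion count.
                if (Interlocked.Decrement(ref pending) == 0)
                    done.Set();
            });
        }

        done.WaitOne();
        Console.WriteLine("counter = {0}", counter);   // always 100
    }
}
```

With a plain `counter++` instead, two pool threads could read the same old value and one increment would be lost.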

Maximum number of threads allowed in the Windows operating system

By default, a thread's stack reserves 1 MB of memory space, and the user-mode memory space available to a 32-bit process is only 2 GB, so in theory a process can create at most about 2048 threads.
Of course, the memory is not used entirely for thread stacks, so the actual number is smaller than this value.
You can also change the default stack size at link time; making it smaller lets you create more threads. If you reduce the default stack size to 512 KB, in theory you can create up to 4096 threads.

Even if the physical memory is very large, the number of threads in a process is still limited by the 2 GB user-mode address space. Suppose your machine has 64 GB of physical memory: each 32-bit process still has only a 4 GB address space, of which just 2 GB is available to user mode.
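The 2048 and 4096 figures above are simply the 2 GB user-mode address space divided by the per-thread stack reservation; a quick check of the arithmetic:

```csharp
using System;

class StackMath
{
    static void Main()
    {
        // 32-bit Windows gives each process 2 GB of user-mode address space.
        long userSpace = 2L * 1024 * 1024 * 1024;

        // Default 1 MB stack reservation per thread.
        Console.WriteLine(userSpace / (1024 * 1024));   // prints 2048 (upper bound)

        // Halving the reservation to 512 KB doubles the bound.
        Console.WriteLine(userSpace / (512 * 1024));    // prints 4096 (upper bound)
    }
}
```

These are theoretical upper bounds; code, heap, and loaded DLLs all eat into the same address space, so the real limit is lower.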


Across the whole machine, the number of threads that can be created is also limited by memory. Each thread object occupies non-paged memory (not pageable memory), and when non-paged memory is exhausted, no more threads can be created.

So if the physical memory is very large, the limit on the number of threads that can run on the same machine rises accordingly.

Why does a Windows program exit abnormally after a process creates about 2000 threads?

This happens because on a 32-bit Windows system, the maximum virtual memory a process can use is 2 GB, and a thread's default stack size (StackSize) is 1024 KB (1 MB). When the number of threads approaches 2000, 2000 * 1024 KB ≈ 2 GB, and the address space is effectively exhausted.

MSDN original text:

"The number of threads a process can create is limited by the available virtual memory. By default, every thread has one megabyte of stack space. Therefore, you can create at most 2,048 threads. If you reduce the default stack size, you can create more threads. However, your application will have better performance if you create one thread per processor and build queues of requests for which the application maintains the context information. A thread would process all requests in a queue before processing requests in the next queue."

How do I break the 2000-thread limit?

You can reduce the thread stack size (StackSize) through the CreateThread parameters, for example:

#include <windows.h>
#include <stdio.h>

#define MAX_THREADS 50000

DWORD WINAPI ThreadProc(LPVOID lpParam)
{
    while (1) {
        Sleep(100000);
    }
    return 0;
}

int main()
{
    /* static, so these large arrays don't blow the main thread's own stack */
    static DWORD dwThreadId[MAX_THREADS];
    static HANDLE hThread[MAX_THREADS];
    for (int i = 0; i < MAX_THREADS; ++i)
    {
        /* Reserve a small stack instead of the default 1 MB (the exact size
           was garbled in the original; 64 KB is used here as an example).
           STACK_SIZE_PARAM_IS_A_RESERVATION makes the size parameter the
           reservation, not just the initial commit. */
        hThread[i] = CreateThread(0, 64 * 1024, ThreadProc, 0,
                                  STACK_SIZE_PARAM_IS_A_RESERVATION,
                                  &dwThreadId[i]);
        if (0 == hThread[i])
        {
            DWORD e = GetLastError();
            printf("%d\r\n", e);
            break;
        }
    }
    ThreadProc(0);
    return 0;
}

Server-side programming

If your server is programmed to create one thread per client connection request, you will run into the ~2000-thread limit (depending on hardware memory and CPU count). The recommendations are as follows:

"The 'one thread per client' model is well known not to scale beyond a dozen clients or so. If you're going to be handling more than that many clients simultaneously, you should move to a model where instead of dedicating a thread to a client, you instead allocate an object. (Someday I'll muse on the duality between threads and objects.) Windows provides I/O completion ports and a thread pool to help you convert from a thread-based model to a work-item-based model."

1. Serve many clients with each thread, and use nonblocking I/O and level-triggered readiness notification
2. Serve many clients with each thread, and use nonblocking I/O and readiness change notification
3. Serve many clients with each server thread, and use asynchronous I/O
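As an illustrative sketch of the work-item-based model (the Request class and HandleRequest method are hypothetical, not part of any real server API), each client request becomes a pool work item instead of getting its own dedicated thread:

```csharp
using System;
using System.Threading;

class WorkItemServer
{
    // Hypothetical request object; a real server would carry a socket or
    // I/O-completion-port context here instead of a plain id.
    class Request
    {
        public int ClientId;
        public Request(int id) { ClientId = id; }
    }

    static int pending = 5;
    static ManualResetEvent allDone = new ManualResetEvent(false);

    static void HandleRequest(object state)
    {
        Request req = (Request)state;
        // Whichever pool thread is free picks the request up;
        // no thread is dedicated to the client.
        Console.WriteLine("client {0} served by thread {1}",
            req.ClientId, Thread.CurrentThread.GetHashCode());
        if (Interlocked.Decrement(ref pending) == 0)
            allDone.Set();
    }

    static void Main()
    {
        // One work item per request instead of one thread per client,
        // so a handful of pool threads can serve many clients.
        for (int i = 0; i < 5; i++)
            ThreadPool.QueueUserWorkItem(new WaitCallback(HandleRequest), new Request(i));

        allDone.WaitOne();
        Console.WriteLine("all requests served");
    }
}
```

A production server would feed such work items from I/O completion ports rather than a loop, but the queuing pattern is the same.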

