This article covers:
- Simple use of threads
- The difference between concurrency and asynchrony
- Concurrency control: locks
- The signaling mechanism of threads
- Threads in the thread pool
- Case: an asynchronous logging component that supports concurrency
Simple use of threads
Concurrency and asynchrony are mostly implemented on top of threads, so this article first covers the simple use of threads.
To use threads, reference the System.Threading namespace. The simplest way to create a thread is to construct a new Thread and pass it either a ThreadStart delegate (no arguments) or a ParameterizedThreadStart delegate (one object argument), as follows:
```csharp
class Program
{
    static void Main(string[] args)
    {
        // Use a parameterless delegate (ThreadStart).
        Thread t = new Thread(Go);
        t.Start();

        // Use a delegate with a parameter (ParameterizedThreadStart).
        Thread t2 = new Thread(GoWithParam);
        t2.Start("Message from Main.");
        t2.Join(); // wait for thread t2 to complete
        Console.WriteLine("Thread t2 has ended!");
        Console.ReadKey();
    }

    static void Go()
    {
        Console.WriteLine("Go!");
    }

    static void GoWithParam(object msg)
    {
        Console.WriteLine("Go with param! Message: " + msg);
        Thread.Sleep(1000); // simulate a time-consuming operation
    }
}
```
Output:
That's all we need to know about basic thread usage for now. Next, let's examine concurrency and asynchrony through a piece of code.
The difference between concurrency and asynchrony
To illustrate concurrency and asynchrony, let's write code that simulates multiple threads each writing 1000 logs at the same time:
```csharp
class Program
{
    static void Main(string[] args)
    {
        Thread t1 = new Thread(Working);
        t1.Name = "Thread1";
        Thread t2 = new Thread(Working);
        t2.Name = "Thread2";
        Thread t3 = new Thread(Working);
        t3.Name = "Thread3";

        // Start the 3 threads in turn.
        t1.Start();
        t2.Start();
        t3.Start();
        Console.ReadKey();
    }

    // Each thread does the same work at the same time.
    static void Working()
    {
        // Simulate 1000 write-log operations.
        for (int i = 0; i < 1000; i++)
        {
            // Write the log asynchronously.
            Logger.Write(Thread.CurrentThread.Name + " writes a log: " + i + ", on " + DateTime.Now.ToString() + ".\n");
        }

        // Do some other work.
        for (int i = 0; i < 1000; i++) { }
    }
}
```
The code is very simple. Think of Logger as a log-writing component; we don't care about its implementation yet, only that it provides a write-log function.
So what does this code have to do with concurrency and asynchrony?
Let's first describe this code with a diagram:
Observe: the 3 threads call Logger to write logs simultaneously. From Logger's point of view, 3 threads are handing it tasks at the same time, and this situation is concurrency. From the point of view of any single thread, asking Logger to write a log for it at some moment while it carries on with its own work is asynchrony.
(A note for the readers who gave feedback: my earlier definitions of these two concepts were not comprehensive, since concurrency and asynchrony are properly distinguished at the operating-system and CPU level. This article aims for plain, readable language rather than textbook rigor, so I have removed those earlier definitions. Thanks to everyone for the lively discussion.)
Next, let's continue with several useful topics around threads and concurrency: locks, the signaling mechanism, and the thread pool.
Concurrency control: locks
The CLR allocates a separate stack for each thread, so their local variables remain isolated from one another.
Threads can also share data, such as a property of a common object or a static field. However, sharing data between threads raises safety issues. For example, suppose the main thread and a new thread share a variable done, which flags that some work has been completed (so other threads don't repeat it):
```csharp
class Program
{
    static bool done;

    static void Main(string[] args)
    {
        new Thread(Go).Start(); // call Go on a new thread
        Go();                   // call Go on the main thread
        Console.ReadKey();
    }

    static void Go()
    {
        if (!done)
        {
            Thread.Sleep(500); // simulate a time-consuming operation
            Console.WriteLine("Done");
            done = true;
        }
    }
}
```
Output:
"Done" is printed twice: the work was performed two times. Because access was not controlled, we have a thread-safety problem and cannot guarantee the state of the shared data.
To solve this, we need a lock (also called an exclusive lock or mutex). With the lock statement, we can guarantee that the shared data is accessed by only one thread at a time. The object given to lock must be a reference-type object that is not null, so the usual pattern is to create a dedicated non-null object just for locking. Modify the code above as follows:
```csharp
class Program
{
    static bool done;
    static readonly object locker = new object();

    static void Main(string[] args)
    {
        new Thread(Go).Start(); // call Go on a new thread
        Go();                   // call Go on the main thread
        Console.ReadKey();
    }

    static void Go()
    {
        lock (locker)
        {
            if (!done)
            {
                Thread.Sleep(500); // simulate a time-consuming operation
                Console.WriteLine("Done");
                done = true;
            }
        }
    }
}
```
Look at the results again:
Using the lock, we solved the problem. But locks bring another thread-safety issue of their own: deadlock. The probability of deadlock is small, but it must still be avoided. Ensuring that the "lock" operations are always performed by one thread is one way to avoid deadlock, and the case study below uses it.
We won't go deeper into deadlock here; interested readers can look up the relevant material.
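As a brief illustration (my own sketch, not from the original article): the classic deadlock happens when two threads each need two locks and take them in opposite order, so each ends up holding the lock the other is waiting for. One simple prevention rule is to always acquire locks in the same order, as this runnable example does:

```csharp
using System;
using System.Threading;

class LockOrderingDemo
{
    static readonly object lockA = new object();
    static readonly object lockB = new object();
    static int counter = 0;

    static void Work()
    {
        // Both threads take the locks in the SAME order (A, then B), so a
        // thread can never hold one lock while waiting for the other one.
        // If one thread instead took them as B-then-A, the two threads could
        // deadlock, each waiting forever for the lock the other holds.
        for (int i = 0; i < 1000; i++)
        {
            lock (lockA)
            {
                lock (lockB) { counter++; }
            }
        }
    }

    public static void Main()
    {
        var t1 = new Thread(Work);
        var t2 = new Thread(Work);
        t1.Start(); t2.Start();
        t1.Join(); t2.Join();
        Console.WriteLine(counter); // prints 2000
    }
}
```

Because every acquisition follows the same A-then-B order, the program always finishes; `counter` is incremented 2000 times under both locks.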
The signaling mechanism of threads
Sometimes a thread needs to wait until it receives a signal before it starts (or continues) executing; this is a signal-based event mechanism. The .NET Framework provides the ManualResetEvent class for this: its WaitOne instance method makes the current thread wait (block) until a signal is received, and its Set method opens (sends) the signal. Here is an example of the signaling mechanism in use:
```csharp
static void Main(string[] args)
{
    var signal = new ManualResetEvent(false);

    new Thread(() =>
    {
        Console.WriteLine("Waiting for signal...");
        signal.WaitOne();
        signal.Dispose();
        Console.WriteLine("Got signal!");
    }).Start();

    Thread.Sleep(1000);
    signal.Set(); // open the "signal"
    Console.ReadKey();
}
```
Output:
Once Set has been called, the signal stays open; it can be closed again with the Reset method, and released with Dispose when it is no longer needed. If the expected wait time is short, you can use ManualResetEventSlim instead of ManualResetEvent; it performs better for short waits. The signaling mechanism is very useful, and the logging case study below relies on it.
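As a small sketch of the Slim variant (my example, not from the original article): ManualResetEventSlim spins briefly before actually blocking, which makes it cheaper when the wait is expected to be very short. Its wait method is Wait() rather than WaitOne():

```csharp
using System;
using System.Threading;

class SlimSignalDemo
{
    public static void Main()
    {
        // ManualResetEventSlim: cheaper than ManualResetEvent for short waits.
        using (var signal = new ManualResetEventSlim(false))
        {
            string message = null;

            var worker = new Thread(() =>
            {
                signal.Wait();          // block until Set() is called
                message = "Got signal!";
            });
            worker.Start();

            signal.Set();               // open the signal
            worker.Join();              // wait for the worker to finish
            Console.WriteLine(message); // prints "Got signal!"
        }
    }
}
```

The pattern is the same as with ManualResetEvent: one thread waits on the signal, another opens it.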
Threads in the thread pool
Threads in the thread pool are managed by the CLR. The thread pool works best under two conditions:
- Tasks are relatively short-running (under roughly 250 ms), so the CLR can fully recycle existing idle threads to handle them;
- Tasks that spend a large amount of time waiting (or blocked) do not dominate the pool.
There are two main ways to run work on a thread-pool thread:
```csharp
// Mode 1: Task.Run (.NET Framework 4.5 and later only)
Task.Run(() => Console.WriteLine("Hello from the thread pool"));

// Mode 2: ThreadPool.QueueUserWorkItem
ThreadPool.QueueUserWorkItem(t => Console.WriteLine("Hello from the thread pool"));
```
The thread pool lets threads be used fully and efficiently and reduces the startup latency of tasks. But not every situation suits a pool thread; the logging case below, which writes files asynchronously, is one example. The point of this section is a general sense of when to use pool threads and when not to: do not put tasks that run for a long time or that block onto the thread pool.
To create a thread that does not come from the pool, either construct one directly with new Thread, or use the following code:
```csharp
// Note: the TaskCreationOptions.LongRunning parameter must be passed,
// otherwise the work would be scheduled on a pool thread.
Task task = Task.Factory.StartNew(() => { /* ... */ }, TaskCreationOptions.LongRunning);
```
Task makes an appearance here; don't worry about it for now, a later post will cover it in detail.
There is much more to know about threads, but we'll stop here; this is enough for everyday web development.
Case: an asynchronous logging component that supports concurrency
We used a Logger class in the code under "The difference between concurrency and asynchrony" above; now let's build that Logger. With the knowledge above, we can implement concurrent log writing for an application. Writing logs is a common feature, so let's briefly analyze its requirements:
- Executes asynchronously in the background, without affecting other threads.

Given the thread pool's two optimal-use conditions above, the write-log thread spends most of its time blocked (or waiting), so the thread pool is not appropriate for it. That is, we should not use Task.Run for the logging thread itself; a dedicated new Thread is better.

- Supports concurrency: multiple tasks (on different threads) can call the write-log function at the same time, so it must be thread-safe.

Supporting concurrency means using a lock, and fully guaranteeing thread safety means avoiding deadlock. If we keep the "lock" operations so that queued work is always executed by the same thread, we avoid the deadlock problem: concurrent requests are simply placed in a queue and processed by that one thread in turn. (Because logging runs in the background and needs no immediate response to the user, this is acceptable.)

- A single instance, and a single thread.

Every call to the write-log function anywhere in the application should go through the same Logger instance (it would obviously be wasteful to create a new instance for every log write), which calls for the singleton pattern. And no matter how many tasks call the write-log feature, the same one thread should process all the write operations, so that we neither consume too many thread resources nor pay the startup delay of creating new threads.
With the above knowledge, let's write such a class. The rough idea:
- A queue to hold the write-log tasks.
- A signaling mechanism to indicate whether there are new tasks to perform.
- When a new write-log task arrives, add it to the queue and send the signal.
- A method that processes the queue: when the new-task signal is received, execute the queued tasks in order.
Before developing a feature, it helps to hold a simple plan like this in mind. Problems will surface during development, and you can extend and refine then. It's hard to think of everything up front; start with a simple idea, then write the code!
Here is an initial implementation of such a logger class:
```csharp
public class Logger
{
    // Queue that holds the write-log tasks.
    private Queue<Action> _queue;
    // The dedicated thread used to write logs.
    private Thread _loggingThread;
    // "Signal" used to notify that there are new logs to write.
    private ManualResetEvent _hasNew;

    // Constructor: initialize.
    private Logger()
    {
        _queue = new Queue<Action>();
        _hasNew = new ManualResetEvent(false);
        _loggingThread = new Thread(Process);
        _loggingThread.IsBackground = true;
        _loggingThread.Start();
    }

    // Singleton pattern: keep a single Logger instance.
    private static readonly Logger _logger = new Logger();

    private static Logger GetInstance()
    {
        /* Unsafe code:
        lock (locker)
        {
            if (_logger == null) { _logger = new Logger(); }
        }*/
        return _logger;
    }

    // Process the tasks in the queue.
    private void Process()
    {
        while (true)
        {
            // Wait to receive the signal; this blocks the thread.
            _hasNew.WaitOne();
            // After receiving the signal, reset it (turn the signal off).
            _hasNew.Reset();

            // Tasks can accumulate in the queue quickly. Waiting here lets us
            // handle more tasks per pass, reducing frequent in-and-out
            // operations on the queue.
            Thread.Sleep(100);

            // Because new tasks may arrive while we execute, we must not
            // iterate _queue directly. Copy the tasks out, clear _queue,
            // and operate on the copy.
            Queue<Action> queueCopy;
            lock (_queue)
            {
                queueCopy = new Queue<Action>(_queue);
                _queue.Clear();
            }

            foreach (var action in queueCopy)
            {
                action();
            }
        }
    }

    private void WriteLog(string content)
    {
        lock (_queue) // TODO: there is a thread-safety issue here; blocking may occur.
        {
            // Add the task to the queue.
            _queue.Enqueue(() => File.AppendAllText("Log.txt", content));
        }
        // Open the "signal".
        _hasNew.Set();
    }

    // Expose a Write method for external callers. WriteLog only enqueues a
    // task and runs very briefly, so Task.Run is appropriate here.
    public static void Write(string content)
    {
        Task.Run(() => GetInstance().WriteLog(content));
    }
}
```
Now that the class is written, test it with the code from "The difference between concurrency and asynchrony" above. A run on my machine gives:
3,000 logs in total, and the result is correct.
The comments in the Logger class above are detailed, so I won't analyze it line by line.
The goal of this example is to get a grasp of the basic application of threads and concurrency in development, and of the issues to watch out for.
Unfortunately, this Logger class is not perfect: it still has a thread-safety problem (marked with the TODO in the code), although the probability of hitting it in a real environment is small. It may be hard to reproduce by running the code above (I have run it many times without an exception), but with more threads writing simultaneously, problems can appear.
So how do you solve this thread-safety problem?
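One possible direction (my sketch, not the original author's solution) is to replace the hand-rolled Queue plus lock with System.Collections.Concurrent.ConcurrentQueue&lt;T&gt;, whose Enqueue and TryDequeue are thread-safe without explicit locking, so producers never block on each other the way lock(_queue) can. This runnable example mimics the article's scenario of 3 threads each enqueuing 1000 log entries, drained by a single consumer as Logger.Process does:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

class ConcurrentQueueDemo
{
    public static void Main()
    {
        // ConcurrentQueue<T> is safe for many producers at once.
        var queue = new ConcurrentQueue<string>();

        // 3 producer threads, each enqueuing 1000 entries concurrently.
        var producers = new Thread[3];
        for (int p = 0; p < 3; p++)
        {
            producers[p] = new Thread(() =>
            {
                for (int i = 0; i < 1000; i++)
                    queue.Enqueue("log entry");
            });
            producers[p].Start();
        }
        foreach (var t in producers) t.Join();

        // Drain the queue on a single consumer thread,
        // the way Logger.Process drains its copy of the queue.
        int drained = 0;
        string item;
        while (queue.TryDequeue(out item)) drained++;

        Console.WriteLine(drained); // prints 3000
    }
}
```

Combined with the existing signal, this removes the lock around the queue entirely; whether it eliminates the specific issue the author had in mind is left as the article intends, for the reader to explore.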
Reference Address: http://www.cnblogs.com/kesimin/p/5085460.html
C# Advanced Knowledge Point Overview (2): Threads, Concurrency, Locks