Multithreading Practice--Overview and concepts


C# supports executing code in parallel through multithreading. A thread is an independent execution path that can run concurrently with other threads. A C# program starts in a single thread, which is created automatically by the CLR and the operating system (the "main thread"), and becomes multithreaded by creating additional threads.

A simple example is as follows:

using System;
using System.Threading;

class ThreadDemo
{
    static void Main()
    {
        Thread t = new Thread(WriteY);    // create the new thread
        t.Start();                        // run WriteY() on the new thread

        while (true) Console.Write("x");  // meanwhile the main thread prints "x"
    }

    static void WriteY()
    {
        while (true) Console.Write("y");  // the new thread prints "y"
    }
}

The main thread creates a new thread t, which runs a method that repeatedly prints the letter "y", while the main thread repeatedly prints the letter "x". The CLR gives each thread its own memory stack, so local variables are kept separate. In the next method, we define a local variable and then call the method simultaneously on the main thread and on a newly created thread.

static void Main()
{
    new Thread(Go).Start();   // call Go() on a new thread
    Go();                     // call Go() on the main thread
}

static void Go()
{
    // "cycles" is a local variable, so each thread gets its own copy
    for (int cycles = 0; cycles < 5; cycles++)
        Console.Write('?');
}

A separate copy of the cycles variable is created on each thread's own memory stack, so the output is predictable: ten question marks in total. When threads reference a common target instance, however, they share its data. The example is as follows:

class ThreadTest
{
    bool done;

    static void Main()
    {
        ThreadTest tt = new ThreadTest();   // a single shared instance
        new Thread(tt.Go).Start();
        tt.Go();
    }

    void Go()   // an instance method, so both calls see the same "done" field
    {
        if (!done) { done = true; Console.WriteLine("done"); }
    }
}

In the example above, both threads call Go() on the same ThreadTest instance, so they share the done field, and the result is that "done" is printed once rather than twice.

Static fields provide another way to share data between threads. Here is the same example with done as a static field:

class ThreadDemo
{
    static bool done;    // static fields are shared across all threads

    static void Main()
    {
        new Thread(Go).Start();
        Go();
    }

    static void Go()
    {
        if (!done) { done = true; Console.WriteLine("done"); }
    }
}

These two examples are enough to illustrate another key concept: thread safety (or rather, the lack of it!). The output is actually indeterminate: it is possible (though unlikely) for "done" to be printed twice. If, however, we swap the order of the statements in the Go method, the odds of "done" being printed twice rise dramatically:

static void Go()
{
    if (!done) { Console.WriteLine("done"); done = true; }
}

The problem is that one thread can be evaluating the if condition at exactly the moment the other thread is still executing the WriteLine statement, before it has set done to true. The remedy is to obtain an exclusive lock while reading and writing the shared field, and C# provides the lock statement for exactly this purpose:

class ThreadSafe
{
    static bool done;
    static object locker = new object();

    static void Main()
    {
        new Thread(Go).Start();
        Go();
    }

    static void Go()
    {
        lock (locker)    // only one thread at a time can enter this block
        {
            if (!done) { Console.WriteLine("done"); done = true; }
        }
    }
}

When two threads contend for a lock (in this case, locker), one thread waits, or blocks, until the lock becomes available. This guarantees that only one thread can enter the critical section at a time, so "done" is printed just once. Code protected in this way, so that it behaves predictably in a multithreaded environment, is called thread-safe.

Temporarily pausing, or blocking, is an essential feature of how threads coordinate, or synchronize, their activities. Waiting for an exclusive lock to be released is one reason a thread blocks; another is that the thread wants to pause, or Sleep, for a while:

Thread.Sleep(TimeSpan.FromSeconds(30));   // block for 30 seconds (the duration here is illustrative)

A thread can also wait for another thread to end by calling that thread's Join method:

Thread t = new Thread(Go);   // assumes a Go method as in the earlier examples
t.Start();
t.Join();                    // block until thread t finishes

Once blocked, a thread no longer consumes CPU resources.

How threads work

Threads are managed internally by a thread scheduler, a function the CLR typically delegates to the operating system. The scheduler ensures that all active threads are allocated appropriate execution time, and that threads that are waiting or blocked (for instance, on an exclusive lock or on user input) do not consume CPU time.

On a single-processor computer, the scheduler performs time-slicing: it rapidly switches execution between the active threads. This produces the "choppy" behavior seen in the first example, where each block of repeated x or y characters corresponds to one time slice allotted to that thread. Under Windows XP, a time slice is typically in the tens of milliseconds, chosen to be much larger than the CPU overhead of actually switching between threads (which is usually in the region of a few microseconds).

On a multi-core computer, multithreading is implemented as a mixture of time-slicing and genuine concurrency, with different threads running on different CPUs. Some time-slicing is still almost certain to occur, because the operating system must also service its own threads and those of other applications.

A thread is said to be preempted when its execution is interrupted by an external factor such as time-slicing. In most situations, a thread has no control over when and where it is preempted.

Thread vs. process

All the threads belonging to a single application are logically contained within a process, the operating-system unit under which an application runs.

Threads are in some ways similar to processes: for instance, processes are time-sliced against other processes running on the computer in much the same way as threads within a single C# program. The key difference is that processes are completely isolated from one another, whereas threads share (heap) memory with the other threads in the same program. This is what makes threads so useful: one thread can fetch data in the background while another thread displays the data already fetched in the foreground.

When to use multithreading

Multithreaded programs are typically used to perform time-consuming tasks in the background: the main thread keeps running while a worker thread does the background job. In a Windows Forms program, if the main thread is tied up performing a lengthy operation, keyboard and mouse messages cannot be processed and the application becomes unresponsive. For this reason, it is worth running a time-consuming task on a worker thread even when the main thread is showing a modal "Processing..." dialog because the program cannot proceed until the task is complete. Doing so prevents the operating system from marking the application as "Not Responding", which tempts the user to forcibly kill the process and thereby cause errors. The modal dialog approach also makes it possible to offer a "Cancel" button, because the dialog keeps receiving events while the actual work is done on the worker thread. The BackgroundWorker class assists with exactly this pattern, as sketched below.
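As a rough illustration of that pattern, here is a minimal console sketch (rather than a full Windows Forms handler) using BackgroundWorker; the two-second delay and the result value are purely illustrative stand-ins for real work:

using System;
using System.ComponentModel;
using System.Threading;

class BackgroundWorkerDemo
{
    static void Main()
    {
        BackgroundWorker worker = new BackgroundWorker();

        worker.DoWork += (sender, e) =>
        {
            Thread.Sleep(2000);     // simulate a lengthy task on the worker thread
            e.Result = 42;          // hand a result back to the completion handler
        };

        worker.RunWorkerCompleted += (sender, e) =>
        {
            Console.WriteLine("Worker finished, result = " + e.Result);
        };

        worker.RunWorkerAsync();    // start the background work

        Console.WriteLine("Main thread is still free...");
        Console.ReadLine();         // keep the process alive until the worker completes
    }
}

In a real Windows Forms program the same idea applies, with the RunWorkerCompleted handler updating the UI once the work is done.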

In programs without a user interface, such as a Windows Service, multithreading makes particular sense when a task is potentially time-consuming because it is waiting for a response from another computer, such as an application server, a database server, or a client. Carrying out that task on a worker thread means the main thread is immediately free to do other things.

Another use of multithreading is in methods that perform intensive computation. Such a method can run faster on a multi-core computer if its workload is divided among several threads (the Environment.ProcessorCount property reports the number of processors), as sketched below.
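A minimal sketch of that idea, assuming the workload is simply summing a large array, might divide the array among Environment.ProcessorCount threads like this:

using System;
using System.Threading;

class ParallelSumDemo
{
    static void Main()
    {
        int[] data = new int[1000000];                 // illustrative workload: sum a large array
        for (int i = 0; i < data.Length; i++) data[i] = 1;

        int threadCount = Environment.ProcessorCount;  // one worker per processor
        long[] partials = new long[threadCount];       // each thread writes only its own slot
        Thread[] threads = new Thread[threadCount];
        int chunk = data.Length / threadCount;

        for (int i = 0; i < threadCount; i++)
        {
            int index = i;                             // capture the loop variable for the lambda
            int start = index * chunk;
            int end = (index == threadCount - 1) ? data.Length : start + chunk;

            threads[index] = new Thread(() =>
            {
                long sum = 0;
                for (int j = start; j < end; j++) sum += data[j];
                partials[index] = sum;
            });
            threads[index].Start();
        }

        foreach (Thread t in threads) t.Join();        // wait for all workers to finish

        long total = 0;
        foreach (long p in partials) total += p;
        Console.WriteLine(total);                      // prints 1000000
    }
}

Because each thread writes only to its own slot in partials, no locking is needed while the workers run; the results are combined only after every thread has been joined.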

A C# program can become multithreaded in two ways: by explicitly creating and running additional threads, or by using .NET Framework features that create threads behind the scenes, such as the BackgroundWorker class, the thread pool, threading timers, Remoting servers, Web Services, or ASP.NET. In the latter cases there is no choice but to live with multithreading; a single-threaded ASP.NET web server would not be much use, even if such a thing existed. Fortunately, multithreading in such application servers is usually manageable; the only thing to worry about is static variables, which need an appropriate locking mechanism. The thread pool is the easiest of these implicit options to demonstrate; see the sketch below.
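As a small sketch of that implicit style (the message text and the one-second wait are purely illustrative), a piece of work can be handed to the thread pool instead of a thread you create yourself:

using System;
using System.Threading;

class ThreadPoolDemo
{
    static void Main()
    {
        // queue a piece of work to a pool thread rather than creating a thread explicitly
        ThreadPool.QueueUserWorkItem(state => Console.WriteLine("Hello from a pool thread"));

        Console.WriteLine("Main thread continues immediately...");
        Thread.Sleep(1000);   // give the pool thread a moment to run before the process exits
    }
}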

When not to use multithreading

Multithreading also has disadvantages. The biggest is that it makes programs considerably more complex. Multiple threads are not complex in themselves; what is complex is their interaction, and that interaction, whether intentional or not, lengthens the development cycle and produces intermittent, hard-to-reproduce bugs. Therefore, either keep thread interaction in a design to a minimum or do not use multithreading at all, unless you have a strong appetite for rewriting and debugging.

Multithreading also increases resource and CPU overhead when threads are allocated and switched frequently. In particular, when heavy I/O is involved, it can be faster to have just one or two worker threads execute tasks in sequence than to have many threads each executing a task at the same time. We will later implement a producer/consumer queue, which provides exactly this functionality.
