Reading Notes: CLR via C#, Chapters 25-26 (Threads)

Tags: APM, message queue, context switches

Objective

I have read this book in a fragmented way two or three times over the past few years. As a classic, it deserves careful, repeated reading, so now that I have started blogging I plan to reread it thoroughly and turn my notes into posts. Writing things down beats relying on memory, the process deepens my own understanding, and of course it lets me share with friends in the technology community.

Thread
    • Internal composition of a thread
      • Thread kernel object: a data structure containing a set of properties that describe the thread. It also includes the so-called thread context, a block of memory that holds a snapshot of the CPU's registers; the structure consumes from a few hundred to a couple of thousand bytes of memory, depending on the CPU architecture
      • Thread environment block (TEB): a block of memory allocated and initialized in user mode (address space that application code can access quickly). The TEB consumes one page of memory (4 KB on x86 and x64 CPUs, 8 KB on IA64 CPUs). It contains the head of the thread's exception-handling chain: each try block the thread enters inserts a node at the head of the chain, and the node is removed when the thread exits the try block. The TEB also contains the thread's thread-local storage data and some data structures used by GDI and OpenGL graphics
      • User-mode stack: stores local variables and arguments passed to methods. It also contains the address indicating where execution should continue when the current method returns. By default, Windows allocates 1 MB of memory for each thread's user-mode stack
      • Kernel-mode stack: also used when application code passes arguments to a kernel-mode function in the operating system. It occupies 12 KB on 32-bit Windows and 24 KB on 64-bit Windows
      • DLL thread-attach and thread-detach notifications: when a thread is created or destroyed, the DLLs loaded in the process are notified; managed DLLs do not receive these notifications, which improves performance
    • Designing for threads: trade speed and performance for reliability and responsiveness
    • Scenarios where a dedicated thread is appropriate (see the sketch after this list):
      • Runs at a non-normal priority
      • Must be a foreground thread
      • Runs for a very long time
      • Needs to be terminated explicitly
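
A minimal sketch of a dedicated thread covering the scenarios above (non-normal priority, foreground, long-running, terminated explicitly); the worker method and the stop flag are made up for illustration:

using System;
using System.Threading;

internal static class DedicatedThreadDemo
{
    private static volatile bool s_stopRequested;            // illustrative stop flag

    public static void Main()
    {
        // A dedicated thread: we control its priority and foreground status,
        // and we terminate it explicitly by setting a flag and joining it.
        Thread worker = new Thread(LongRunningWork);
        worker.IsBackground = false;                          // foreground thread (the default)
        worker.Priority = ThreadPriority.BelowNormal;         // non-normal priority
        worker.Start();

        Console.ReadLine();                                   // run until the user presses Enter
        s_stopRequested = true;                               // explicit termination request
        worker.Join();                                        // wait for a clean shutdown
    }

    private static void LongRunningWork()
    {
        while (!s_stopRequested)
        {
            // ... long-running work would go here ...
            Thread.Sleep(100);
        }
    }
}
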
Context Switches
    • Context Switch steps:
      • Save the values in the CPU's registers into the context structure inside the currently running thread's kernel object
      • Select the next thread to schedule from the set of existing threads; if that thread is owned by a different process, Windows must also switch the virtual address space the CPU "sees" before it can execute any code or touch any data
      • Load the values from the selected thread's context structure into the CPU's registers
    • Windows performs a context switch approximately every 30 milliseconds. Context switches are pure overhead: they sacrifice performance in exchange for a good user experience
    • The time required to perform a context switch depends on the architecture and speed of the CPU
    • If you want to build high-performance applications, you should try to avoid context switching
Compute-bound asynchronous operations
    • Benefits: 1) keep the UI responsive in a GUI application; 2) use multiple CPUs to shorten the time a long-running computation needs, improving scalability and throughput
Thread pool
    • Each CLR has one thread pool, shared by all the AppDomains that the CLR controls
    • If a process loads multiple CLRs, each CLR has its own thread pool
    • When a thread pool thread has nothing to do, it is not destroyed immediately, so no extra creation cost is incurred if more work arrives. If a thread stays idle for a while, it wakes up and terminates itself to release resources; the pool is idle at that point anyway, so the performance cost barely matters
    • Internally the thread pool maintains worker threads and I/O threads (see the sketch after this list)
    • The Asynchronous Programming Model (APM) is used to issue I/O requests to files, networks, databases, Web services, or other hardware devices
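
A minimal sketch of queuing a compute-bound work item to a thread pool worker thread, as referenced above; the callback name and the state value are made up for illustration:

using System;
using System.Threading;

internal static class ThreadPoolDemo
{
    public static void Main()
    {
        Console.WriteLine("Main thread: queuing a work item");

        // The callback runs on a thread pool worker thread.
        ThreadPool.QueueUserWorkItem(ComputeBoundOp, 5);

        Console.ReadLine();   // keep the process alive long enough for the work item to run
    }

    private static void ComputeBoundOp(object state)
    {
        Console.WriteLine("In ComputeBoundOp, state = {0}", state);
        Thread.Sleep(1000);   // simulate some work
    }
}
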
Execution context
    • A data structure associated with each thread that contains security settings (compressed stack, principal, Windows identity), host settings, and logical call context (CallContext) data
    • By default, the initiating thread's execution context flows to (is copied to) the helper thread. This ensures the helper thread performs its work with the same security and host settings, and that the initiating thread's logical call context is available on the helper thread. (This copying has a performance cost.)
    • You can use the ExecutionContext class (for example, its SuppressFlow method) to prevent the context from flowing and improve your application's performance. The same applies to APM and Task; see the sketch below
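
A minimal sketch of suppressing execution-context flow around queued work items, assuming the full .NET Framework (CallContext lives in System.Runtime.Remoting.Messaging); the call-context key and value are made up for illustration:

using System;
using System.Runtime.Remoting.Messaging;   // CallContext
using System.Threading;

internal static class ExecutionContextDemo
{
    public static void Main()
    {
        // Put some data into the initiating thread's logical call context.
        CallContext.LogicalSetData("Name", "Jeffrey");

        // By default the execution context (including the logical call context) flows.
        ThreadPool.QueueUserWorkItem(_ =>
            Console.WriteLine("Default flow, Name = {0}", CallContext.LogicalGetData("Name")));

        // Suppress the flow; the worker thread will not see the call-context data.
        AsyncFlowControl flowControl = ExecutionContext.SuppressFlow();
        ThreadPool.QueueUserWorkItem(_ =>
            Console.WriteLine("Flow suppressed, Name = {0}", CallContext.LogicalGetData("Name")));

        // Restore flowing for any work queued afterwards.
        flowControl.Undo();

        Console.ReadLine();
    }
}
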
Support for cooperative cancellation
    • Long-running operations should support cancellation
    • CancellationTokenSource: the caller calls the CTS's Cancel method, and the code doing the work polls the token's IsCancellationRequested property
    • If an operation must not be cancelable, pass the object returned by the CancellationToken.None property; its CanBeCanceled property returns false
    • The token supports registering callback methods via Register. When the callbacks are invoked synchronously (useSynchronizationContext is true, via Send), they are called sequentially and the first callback that throws an exception stops the rest; otherwise the callbacks are invoked via Post and any unhandled exceptions are added to a collection, an AggregateException whose InnerExceptions property holds them. Typical usage: cts.Token.Register(() => ...); ... cts.Cancel(). Linked cancellation token sources are also supported, merging several sources into one. This is cooperative cancellation (see the sketch below)
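
A minimal sketch of cooperative cancellation with a registered callback and a linked token source; the counting loop and the delays are made up for illustration:

using System;
using System.Threading;

internal static class CancellationDemo
{
    public static void Main()
    {
        var cts1 = new CancellationTokenSource();
        var cts2 = new CancellationTokenSource();

        // A linked source is canceled when either cts1 or cts2 is canceled.
        var linked = CancellationTokenSource.CreateLinkedTokenSource(cts1.Token, cts2.Token);
        linked.Token.Register(() => Console.WriteLine("Linked token canceled"));

        ThreadPool.QueueUserWorkItem(_ => Count(linked.Token, 1000));

        Thread.Sleep(500);       // let the count run for a while
        cts2.Cancel();           // canceling either source cancels the linked token
        Console.ReadLine();
    }

    private static void Count(CancellationToken token, int countTo)
    {
        for (int count = 0; count < countTo; count++)
        {
            if (token.IsCancellationRequested)      // cooperative check
            {
                Console.WriteLine("Count canceled at {0}", count);
                return;
            }
            Console.WriteLine(count);
            Thread.Sleep(100);
        }
        Console.WriteLine("Count is done");
    }
}
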
Task
  • Compared with queuing work to the thread pool directly, which provides no mechanism for completion/callback notification, tasks make this much easier
  • new Task(action, arg).Start(); t.Wait() waits explicitly, and t.Result gets the result (the Result property calls Wait internally)
  • If the task's code throws an exception, it is "swallowed" and stored in a collection; an AggregateException is thrown when Wait or Result is called (its Handle method lets you indicate which exceptions are handled)
  • WaitAny and WaitAll wait for multiple tasks
  • To detect exceptions that were never observed by any code, register a callback with TaskScheduler's static UnobservedTaskException event. When a task is reclaimed by the GC, the CLR's finalizer thread raises this event if the task has exceptions nobody observed. You can call the SetObserved method to mark the exception as handled and prevent the CLR from terminating the process (decide case by case whether to register and handle this event). See the sketch after the TaskFactory example below
  • Tasks can be chained: ContinueWith gives you a non-blocking completion callback
  • Tasks support parent/child relationships
  • Internal composition of a Task object
      1. An Int32 task ID
      2. The task's execution state
      3. A reference to the parent task
      4. A reference to the TaskScheduler specified when the task was created
      5. A reference to the callback method
      6. A reference to the state object to pass to the callback method (AsyncState)
      7. A reference to an ExecutionContext
      8. A reference to a ManualResetEventSlim
      9. References to other supplementary state (CancellationToken, ContinueWith tasks, and so on)
  • So Task trades a little performance for much more convenient and richer asynchronous operations
  • Disposing a Task mainly closes its ManualResetEventSlim object
  • In the IDE you can find your tasks through the Parallel Tasks or Parallel Stacks window
  • The life cycle of a task
      • A newly constructed task has a status of Created
      • After you start a task, its status changes to WaitingToRun
      • The status changes to Running once a thread is actually executing it
      • A task that has finished its own work but is waiting for child tasks has a status of WaitingForChildrenToComplete
      • When a task is finished, its status is RanToCompletion, Canceled, or Faulted
  • When a task completes you get its result through the Result property; when it fails you get the exception through the Exception property
  • Convenience properties are provided: IsCanceled, IsFaulted, and IsCompleted (the last is also true when the task is canceled or faulted)
  • To check that a task ran to completion successfully, test task.Status == TaskStatus.RanToCompletion
  • A task created by ContinueWith, ContinueWhenAll, ContinueWhenAny, or FromAsync is scheduled automatically and has a status of WaitingForActivation
  • To share common state (cancellation token, creation options, continuation options, scheduler) among a group of tasks, use a task factory (TaskFactory); the code is as follows:
using System;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

internal static class TaskFactoryDemo
{
    static void Main(string[] args)
    {
        Task parent = new Task(() =>
        {
            var cts = new CancellationTokenSource();
            var tf = new TaskFactory<int>(cts.Token,
                TaskCreationOptions.AttachedToParent,
                TaskContinuationOptions.ExecuteSynchronously,
                TaskScheduler.Default);

            // Create and start 3 child tasks
            var childTasks = new[]
            {
                tf.StartNew(() => Sum(cts.Token, 10000)),
                tf.StartNew(() => Sum(cts.Token, 20000)),
                tf.StartNew(() => Sum(cts.Token, Int32.MaxValue))   // too big, throws an exception
            };

            // If any child task throws, cancel the remaining child tasks
            for (int task = 0; task < childTasks.Length; task++)
                childTasks[task].ContinueWith(t => cts.Cancel(),
                    TaskContinuationOptions.OnlyOnFaulted);

            // After all child tasks complete, get the maximum value returned from the
            // tasks that neither faulted nor were canceled, then pass that maximum to
            // another task that displays the result
            tf.ContinueWhenAll(childTasks,
                completedTasks => completedTasks
                    .Where(t => !t.IsFaulted && !t.IsCanceled)
                    .Max(t => t.Result),
                CancellationToken.None)
              .ContinueWith(t => Console.WriteLine("The maximum is: " + t.Result),
                  TaskContinuationOptions.ExecuteSynchronously);
        });

        // When the parent's children are done, also display any unhandled exceptions
        parent.ContinueWith(p =>
        {
            StringBuilder sb = new StringBuilder(
                "The following exception(s) occurred:" + Environment.NewLine);
            foreach (var e in p.Exception.Flatten().InnerExceptions)
                sb.AppendLine("   " + e.GetType().ToString());
            Console.WriteLine(sb.ToString());
        }, TaskContinuationOptions.OnlyOnFaulted);

        // Start the parent task so that it can start its children
        parent.Start();
        Console.Read();
    }

    static int Sum(CancellationToken token, int number)
    {
        int sum = 0;
        for (; number > 0; number--)
        {
            token.ThrowIfCancellationRequested();
            checked { sum += number; }
        }
        return sum;
    }
}
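
As mentioned in the bullet on unobserved exceptions above, here is a minimal sketch of registering TaskScheduler.UnobservedTaskException and calling SetObserved; the faulting task and the forced garbage collection are made up for illustration, and whether the event fires depends on timing and GC behavior:

using System;
using System.Threading.Tasks;

internal static class UnobservedExceptionDemo
{
    public static void Main()
    {
        // The event is raised on the finalizer thread when a faulted task whose
        // exception was never observed gets garbage collected.
        TaskScheduler.UnobservedTaskException += (sender, e) =>
        {
            Console.WriteLine("Unobserved: " + e.Exception.InnerException.Message);
            e.SetObserved();   // mark it handled so the process is not affected
        };

        // A task that faults; nobody calls Wait/Result or reads its Exception property.
        Task.Factory.StartNew(() => { throw new InvalidOperationException("Oops"); });

        System.Threading.Thread.Sleep(100);
        GC.Collect();                   // try to collect the faulted task
        GC.WaitForPendingFinalizers();  // give the finalizer thread a chance to raise the event

        Console.ReadLine();
    }
}
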
Task Scheduler

Responsible for scheduling the execution of tasks, providing the basic infrastructure for them

    • The FCL provides two types derived from TaskScheduler: the thread pool task scheduler and the synchronization context task scheduler
    • The thread pool task scheduler (TaskScheduler.Default) is used by default
    • The synchronization context task scheduler is typically used in the presentation layer. It dispatches all tasks to the application's GUI thread so that task code can safely update UI components such as buttons and other controls; it does not use the thread pool at all. A reference to it can be obtained through the TaskScheduler.FromCurrentSynchronizationContext method (see the sketch after this list)
    • Additional task schedulers are freely available for study and use at http://code.msdn.microsoft.com/ParExtSamples:
      • IOTaskScheduler queues tasks to the thread pool's I/O threads instead of its worker threads
      • LimitedConcurrencyLevelTaskScheduler does not allow more than a specified number (n) of tasks to execute at the same time
      • OrderedTaskScheduler allows only one task to execute at a time; it derives from the scheduler above, passing 1 for n
      • PrioritizingTaskScheduler dispatches tasks to the CLR's thread pool and then lets you schedule tasks at a priority above or below normal
      • ThreadPerTaskScheduler creates and starts a separate thread for each task, without using the thread pool at all
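
A minimal sketch of the synchronization context task scheduler referenced above, assuming a Windows Forms project; the form, the mouse-click handler, and the Sum computation are made up for illustration:

using System;
using System.Threading.Tasks;
using System.Windows.Forms;

internal sealed class MyForm : Form
{
    private readonly TaskScheduler m_syncContextTaskScheduler;

    public MyForm()
    {
        // By the time the constructor body runs, the WinForms synchronization context
        // has been installed on this (GUI) thread, so we can capture its scheduler.
        m_syncContextTaskScheduler = TaskScheduler.FromCurrentSynchronizationContext();
        Text = "Click in the window to start a computation";
    }

    protected override void OnMouseClick(MouseEventArgs e)
    {
        // Run the compute-bound work on a thread pool thread...
        Task<int> t = Task.Factory.StartNew(() => Sum(10000));

        // ...and run the continuation on the GUI thread so it may touch UI controls.
        t.ContinueWith(task => Text = "Result: " + task.Result,
            m_syncContextTaskScheduler);

        base.OnMouseClick(e);
    }

    private static int Sum(int n)
    {
        int sum = 0;
        for (; n > 0; n--) checked { sum += n; }
        return sum;
    }

    [STAThread]
    private static void Main()
    {
        Application.Run(new MyForm());
    }
}
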
Simplified concurrent Programming

System.Threading.Tasks.Parallel encapsulates the use of tasks and executes work items in parallel

    • Parallel.For(0, 1000, i => DoWork(i));
    • Parallel.ForEach(collection, item => DoWork(item));
    • Parallel.Invoke(() => Method1(), () => Method2());
    • When both are usable, For is faster than ForEach
    • A ParallelOptions object lets you configure the operation, including a cancellation token, the maximum degree of parallelism, and the task scheduler
    • You can also pass a task-local initialization delegate, a body delegate, and a task-local finalization delegate (see the sketch after this list)
    • Both For and ForEach return a ParallelLoopResult instance, which reports whether the loop ran to completion, and so on
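
As noted above, a minimal sketch of the Parallel.For overload that takes a task-local initialization delegate, a body delegate, and a task-local finalization delegate, here used to total file sizes; the directory path is hypothetical:

using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;

internal static class ParallelDemo
{
    public static void Main()
    {
        string[] files = Directory.GetFiles(@"C:\Temp");   // hypothetical directory
        long masterTotal = 0;

        ParallelLoopResult result = Parallel.For(0, files.Length,
            () => 0L,                                   // localInit: per-task running total
            (index, loopState, taskLocalTotal) =>       // body: runs on many threads
            {
                long fileLength = new FileInfo(files[index]).Length;
                return taskLocalTotal + fileLength;     // no locking needed here
            },
            taskLocalTotal =>                           // localFinally: merge per-task totals
                Interlocked.Add(ref masterTotal, taskLocalTotal));

        Console.WriteLine("Loop completed: {0}, total bytes: {1}",
            result.IsCompleted, masterTotal);
    }
}
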
PLINQ
    • Spreads the processing of a collection's data items across multiple CPUs so that multiple items are processed concurrently
    • The static System.Linq.ParallelEnumerable class implements all of PLINQ's functionality, exposing parallel versions of the familiar LINQ operators as extension methods for the System.Linq.ParallelQuery<T> type. To turn a sequential query into a parallel one, call ParallelEnumerable's AsParallel extension method; to go back the other way, call AsSequential
    • To process a query's results in parallel as well, use ParallelEnumerable's ForAll method
    • If PLINQ's concurrent processing must preserve the order of the data items, call ParallelEnumerable's AsOrdered method; AsUnordered does the opposite
    • Other options: WithCancellation, WithDegreeOfParallelism
    • When PLINQ merges the processed data items back together, you can call WithMergeOptions, passing flags that control how results are buffered and merged (see the sketch below)
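
A minimal sketch of a PLINQ query combining AsParallel, AsOrdered, WithCancellation, ForAll, and AsSequential, as referenced above; the data range and the filter are made up for illustration:

using System;
using System.Linq;
using System.Threading;

internal static class PlinqDemo
{
    public static void Main()
    {
        var cts = new CancellationTokenSource();

        // Parallel query: filter even numbers, preserving the original order.
        var evens = Enumerable.Range(0, 100)
            .AsParallel()
            .AsOrdered()
            .WithCancellation(cts.Token)
            .Where(n => n % 2 == 0);

        // Process the results in parallel too (output order is not guaranteed here).
        evens.ForAll(n => Console.WriteLine(n));

        // Or fall back to sequential processing of the parallel query's results:
        foreach (int n in evens.AsSequential())
            Console.Write(n + " ");
        Console.WriteLine();
    }
}
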
Timer

System.Threading.Timer

      • Internally, the thread pool uses just one thread for all Timer objects. That thread knows when the next Timer is due; when a Timer expires, the thread wakes up and internally calls ThreadPool.QueueUserWorkItem to add a work item to the thread pool's queue
      • If the callback method takes too long, the timer can fire again, causing the thread pool to use multiple threads to run the callback concurrently. To avoid this, pass Timeout.Infinite for the period parameter when constructing the Timer so it fires only once; then, at the end of the callback, call the Change method to specify a new dueTime, again passing Timeout.Infinite for the period (see the sketch after this list)
      • The Timer class provides a Dispose method that cancels the timer entirely and can signal the kernel object identified by its notifyObject parameter once all pending callbacks have completed
      • When a Timer object is garbage collected, its finalization code tells the thread pool to cancel the timer so that it no longer fires. So when using a timer, keep a variable holding the Timer object alive; otherwise the callbacks will stop
      • From my testing, when the timer is disposed/terminated it executes the timer's callback method one more time
      • Other timers
        • System.Windows.Forms.Timer associates the timer with the calling thread. When it fires, Windows injects a timer message into that thread's message queue, and the thread's message pump extracts the messages and dispatches them to the callback method. All the work is done by one thread, and the thread that set the timer is the thread that executes the callback, so the callback is never executed concurrently by multiple threads
        • The System.Windows.Threading.DispatcherTimer class is the equivalent of the Forms timer
        • System.Timers.Timer is a wrapper around System.Threading.Timer; when the timer expires, the CLR queues the event to the thread pool. It is typically used when you want to drag and drop a timer component onto a design surface; not recommended
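
As referenced above, a minimal sketch of the single-fire System.Threading.Timer pattern that reschedules itself with Change at the end of each callback; the 2-second interval is made up for illustration:

using System;
using System.Threading;

internal static class TimerDemo
{
    private static Timer s_timer;   // keep the Timer rooted so it is not garbage collected

    public static void Main()
    {
        // dueTime = 0 (fire immediately), period = Timeout.Infinite (fire only once).
        s_timer = new Timer(Status, null, 0, Timeout.Infinite);
        Console.ReadLine();
    }

    private static void Status(object state)
    {
        Console.WriteLine("Status at {0:T}", DateTime.Now);

        // ... the real work of the callback would go here ...

        // Reschedule the timer to fire once more in 2 seconds. Because rescheduling
        // happens at the end of the callback, callbacks can never overlap on multiple threads.
        s_timer.Change(2000, Timeout.Infinite);
    }
}
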
Use of the thread pool
    • The default maximum is 1,000 threads
    • A 32-bit process has at most 2 GB of usable address space; after loading Win32 and CLR DLLs and allocating the native and managed heaps, about 1.5 GB remains. Since each thread consumes more than 1 MB, at most about 1,360 threads can be created before an OutOfMemoryException is thrown
    • A 64-bit process gets an 8 TB address space, so in theory millions of threads could be created
    • Jeffrey Richter recommends that you not call the thread pool's static methods that adjust these thread limits
    • The default minimum number of threads equals the number of CPUs the process is allowed to use; normally a process is allowed to use all the CPUs on the machine, so the thread pool quickly creates worker threads up to the machine's CPU count (see the sketch after this list)
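
A minimal sketch of querying the thread pool's current minimum and maximum thread counts, as referenced above; the output formatting is made up for illustration:

using System;
using System.Threading;

internal static class ThreadPoolLimitsDemo
{
    public static void Main()
    {
        int minWorker, minIo, maxWorker, maxIo;

        // Minimums default to the number of CPUs the process may use.
        ThreadPool.GetMinThreads(out minWorker, out minIo);

        // Maximums bound how many threads the pool will ever create.
        ThreadPool.GetMaxThreads(out maxWorker, out maxIo);

        Console.WriteLine("Worker threads: min={0}, max={1}", minWorker, maxWorker);
        Console.WriteLine("I/O threads:    min={0}, max={1}", minIo, maxIo);
        Console.WriteLine("Processor count: {0}", Environment.ProcessorCount);
    }
}
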
Tips
    • When a garbage collection runs, the CLR must suspend (pause) all the threads, walk their stacks to find roots so it can mark objects in the heap (the first phase), walk their stacks again, and then resume all the threads. So reducing the number of threads also improves the garbage collector's performance
    • Jeffrey also takes a little jab at the design of Windows Notepad, heh
    • Threads are "expensive", so use them judiciously
    • Treat CLR threads and Windows threads as separate concepts; the future trend is toward simpler code and better performance
    • If you P/Invoke native code that depends on the physical OS thread, use the System.Threading.Thread.BeginThreadAffinity and Thread.EndThreadAffinity methods to notify the CLR
    • You can never guarantee that your thread keeps running, and you cannot prevent other threads from running, because of thread context switches
    • The priority-0 thread is the zero page thread; when no other thread needs to run, it zeroes the free pages of system RAM
    • The CLR's finalizer thread runs with time-critical priority
    • When all the foreground threads in a process stop running, the CLR forcibly terminates any background threads that are still running; they are terminated immediately and no exception is thrown
    • Once all foreground threads have terminated, the application exits and the whole process can be destroyed
    • Try to avoid using foreground threads
    • Cache lines and false sharing: JR clearly knows the Windows kernel inside and out; I don't, so I'll just note my admiration here

These are just reading notes; to learn more about thread-related topics, you can also refer to:

http://msdn.microsoft.com/en-us/library/orm-9780596527570-03-19.aspx

Talking about the thread pool (part 1): the role of the thread pool and the CLR thread pool

Talking about the thread pool (part 2): independent thread pools and the I/O thread pool

Talking about the thread pool (part 3): related tests and things to watch out for
