Parallel Programming in .NET (4): A High-Performance Asynchronous Queue


This queue is mainly used to process real-time data streams and to simplify the multi-threaded programming model. The design had to meet the following requirements (taken from a real company project):

1. Support enqueueing from multiple threads while keeping the multi-threaded programming model as simple as possible.
2. Use an event-trigger mechanism: data is processed only when items are enqueued, not on a polling schedule, and the consumer thread may block internally while waiting.
3. Guarantee that items are processed in the same order in which they were enqueued.
4. Be fault-tolerant and able to run without interruption.

The solutions to these requirements:

1. ConcurrentQueue supports multiple threads and performs well under concurrency. To simplify the programming model, the adapter pattern is used: the consumer thread is encapsulated inside the queue, and user work is handed to event handlers that the queue invokes internally.
2. A semaphore is not a good fit for the event-trigger mechanism, because threads only block once the semaphore reaches its limit, so this part has to be written by hand (see the source code for details).
3. The FIFO nature of the queue, combined with a single consumer, guarantees the order in which items are processed.
4. An exception-handling callback can be registered so that errors are reported without stopping the queue.
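As a rough sketch of how such a queue is meant to be consumed (hypothetical wiring, assuming the `AsynQueue<T>` class developed below and that its `EventArgs<Exception>` wrapper exposes the exception via a `Data` property):

```csharp
using System;

class Producer
{
    static void Main()
    {
        var queue = new AsynQueue<string>();

        // The handler runs on the queue's internal consumer thread
        queue.ProcessItemFunction += item =>
            Console.WriteLine("processed: " + item);

        // Register a handler so a faulty item cannot kill the queue
        queue.ProcessException += (sender, args) =>
            Console.WriteLine("error: " + args.Data.Message);

        // Any number of threads may enqueue concurrently
        queue.Enqueue("frame-1");
        queue.Enqueue("frame-2");

        // Drain remaining items before shutdown: the consumer runs on
        // background threads and would die with the process otherwise
        queue.Flush();
    }
}
```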
Therefore, the following code was developed. (The listing is reconstructed here from the garbled original; the generic `EventArgs<T>` wrapper was not shown in the article and is included as a minimal assumed definition.)

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

// Minimal generic event-args wrapper assumed by the original listing
public class EventArgs<T> : EventArgs
{
    public EventArgs(T data) { Data = data; }
    public T Data { get; private set; }
}

public class AsynQueue<T>
{
    // Flag: is a consumer currently processing queued data?
    private int isProcessing;
    private const int Processing = 1;
    private const int UnProcessing = 0;

    // Is the queue accepting and processing work?
    private volatile bool enabled = true;
    private Task currentTask;

    public event Action<T> ProcessItemFunction;
    public event EventHandler<EventArgs<Exception>> ProcessException;

    private ConcurrentQueue<T> queue;

    public AsynQueue()
    {
        queue = new ConcurrentQueue<T>();
        Start();
    }

    public int Count
    {
        get { return queue.Count; }
    }

    private void Start()
    {
        // Daemon thread that catches items the event path misses
        Thread process_Thread = new Thread(ProcessItem);
        process_Thread.IsBackground = true;
        process_Thread.Start();
    }

    public void Enqueue(T item)
    {
        if (item == null)
        {
            throw new ArgumentNullException("item");
        }
        queue.Enqueue(item);
        DataAdded();
    }

    // Notify the consumer after an item has been added
    private void DataAdded()
    {
        if (enabled && !IsProcessingItem())
        {
            currentTask = Task.Factory.StartNew(ProcessItemLoop);
        }
    }

    // Atomically test whether a consumer is already running;
    // if not, claim the processing flag for this caller.
    private bool IsProcessingItem()
    {
        return Interlocked.CompareExchange(ref isProcessing, Processing, UnProcessing) != UnProcessing;
    }

    private void ProcessItemLoop()
    {
        if (!enabled && queue.IsEmpty)
        {
            Interlocked.Exchange(ref isProcessing, UnProcessing);
            return;
        }

        T publishFrame;
        if (queue.TryDequeue(out publishFrame))
        {
            try
            {
                ProcessItemFunction(publishFrame);
            }
            catch (Exception ex)
            {
                OnProcessException(ex);
            }
        }

        if (enabled && !queue.IsEmpty)
        {
            // Chain the next item onto the thread pool
            currentTask = Task.Factory.StartNew(ProcessItemLoop);
        }
        else
        {
            Interlocked.Exchange(ref isProcessing, UnProcessing);
        }
    }

    /// <summary>
    /// Watchdog loop run on the dedicated monitor thread. It catches
    /// items that were enqueued while no consumer was running, which
    /// can happen because IsEmpty is only a snapshot.
    /// </summary>
    private void ProcessItem()
    {
        int sleepCount = 0;
        int sleepTime = 1000;
        while (enabled)
        {
            if (queue.IsEmpty)
            {
                // Back off: sleep longer after repeated empty checks
                if (sleepCount == 0)
                {
                    sleepTime = 1000;
                }
                else if (sleepCount <= 3)
                {
                    sleepTime = 1000 * 3;
                }
                else
                {
                    sleepTime = 1000 * 50;
                }
                sleepCount++;
                Thread.Sleep(sleepTime);
            }
            else
            {
                // Items are waiting: start a consumer if none is running
                if (enabled && Interlocked.CompareExchange(ref isProcessing, Processing, UnProcessing) == UnProcessing)
                {
                    if (!queue.IsEmpty)
                    {
                        currentTask = Task.Factory.StartNew(ProcessItemLoop);
                    }
                    else
                    {
                        Interlocked.Exchange(ref isProcessing, UnProcessing);
                    }
                }
                sleepCount = 0;
                sleepTime = 1000;
            }
        }
    }

    // Stop accepting work and drain the remaining items synchronously
    public void Flush()
    {
        Stop();
        if (currentTask != null)
        {
            currentTask.Wait();
        }
        while (!queue.IsEmpty)
        {
            try
            {
                T publishFrame;
                if (queue.TryDequeue(out publishFrame))
                {
                    ProcessItemFunction(publishFrame);
                }
            }
            catch (Exception ex)
            {
                OnProcessException(ex);
            }
        }
        currentTask = null;
    }

    public void Stop()
    {
        this.enabled = false;
    }

    private void OnProcessException(Exception ex)
    {
        // Copy the delegate reference for a thread-safe raise
        var handler = ProcessException;
        if (handler != null)
        {
            handler(this, new EventArgs<Exception>(ex));
        }
    }
}
```

The idea behind the queue is as follows. Every time an item is enqueued, the queue calls DataAdded() to check whether a consumer is already processing. If one is, the item simply remains in the internal queue; otherwise a consumer task is started. The consumer runs on the thread pool (Task is implemented on top of the thread pool) and processes items recursively: after finishing the current item, it schedules another thread-pool task for the next one. This forms a processing ring and guarantees that items are handled one at a time, in enqueue order.

Because ConcurrentQueue.IsEmpty is only a snapshot of the queue at one instant, it can report empty now and be non-empty a moment later. The daemon thread process_Thread therefore periodically checks whether a consumer is running. Without it, data could arrive just as the consumer concludes the queue is empty but before the item is actually inserted, and that item might never be processed.

Applicable scenarios:

1. Multiple producers with a single consumer (if several consumers are needed, they can currently be implemented as separate independent threads).
2. Items that are processed quickly. The queue is not suitable for I/O work such as file writes: the thread pool uses background threads, which are terminated when the process exits, so a pending write might never reach disk.
3. As the basic data structure of a pipeline processing model, in which queues communicate through their respective event handlers (a dedicated article on the pipeline model is planned).

Note: the internal ConcurrentQueue can also be replaced by a BlockingCollection. Although a blocking queue is simpler, its consumer is better suited to a dedicated thread than to the thread pool, because the consumer thread blocks whenever the collection is empty.
Blocking a thread-pool thread does work, of course, but it is not recommended, and a blocking queue does not perform as well as a bare ConcurrentQueue.
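For comparison, a minimal sketch of the BlockingCollection variant described above, with the consumer on a dedicated thread rather than the thread pool (the class and variable names here are illustrative, not from the article's code):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

class BlockingVariant
{
    static void Main()
    {
        // Wrap a ConcurrentQueue to keep FIFO ordering
        var queue = new BlockingCollection<int>(new ConcurrentQueue<int>());

        var consumer = new Thread(() =>
        {
            // GetConsumingEnumerable blocks while the collection is
            // empty and completes after CompleteAdding() plus drain,
            // so no CompareExchange gate or watchdog thread is needed.
            foreach (var item in queue.GetConsumingEnumerable())
            {
                Console.WriteLine("processed: " + item);
            }
        });
        consumer.Start();

        for (int i = 0; i < 3; i++)
        {
            queue.Add(i);  // producers may call Add from many threads
        }

        queue.CompleteAdding();  // signal: no more items are coming
        consumer.Join();         // wait for the consumer to finish
    }
}
```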

The content source of this page is from Internet, which doesn't represent Alibaba Cloud's opinion; products and services mentioned on that page don't have any relationship with Alibaba Cloud. If the content of the page makes you feel confusing, please write us an email, we will handle the problem within 5 days after receiving your email.
