In the previous article, .NET Parallel Programming - 3. ConcurrentQueue Implementation and Analysis, we examined how ConcurrentQueue is implemented. In this chapter we build a high-performance asynchronous queue on top of ConcurrentQueue. The queue is mainly used for processing real-time data streams, and it simplifies the multithreaded programming model. The following requirements were considered when designing it (they come from a real project at a company):
1. Support enqueueing and dequeueing from multiple threads, simplifying multithreaded programming as much as possible.
2. Support an event-triggered mechanism: data is processed as soon as it is enqueued rather than on a timer, and the queue may block its internal consumer thread.
3. Guarantee that data items are processed in the same order in which they were enqueued.
4. Strong fault tolerance: the queue must be able to run uninterrupted.
Solutions to the requirement points above:
1. ConcurrentQueue itself supports multiple threads and performs well under contention. To simplify the multithreaded programming model, the adapter pattern is applied: the consumer thread is encapsulated inside the queue, and user work is dispatched internally through processing events.
2. For the event-triggered mechanism, a semaphore is not a good fit, because a semaphore blocks threads only once a specified count is reached, so this part has to be coded by hand (see the source below).
3. ConcurrentQueue already guarantees that items are dequeued in the order they were enqueued, so what must be ensured is that the consuming threads process the data items in that same order.
4. Exceptions can be handled by letting the caller register an exception-handler function, so a failure on one item does not stop the queue.
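Point 4 relies on the standard .NET pattern of copying the event delegate to a local variable before invoking it, so that a subscriber unsubscribing from another thread cannot null the event between the null check and the call. A minimal sketch of that pattern (the Publisher and Demo names are illustrative, not taken from the article's code):

```csharp
using System;

public class Publisher
{
    public event EventHandler<Exception> ProcessException;

    // Copy the delegate to a local before invoking: a concurrent
    // unsubscribe cannot null it between the null check and the call.
    public void OnProcessException(Exception ex)
    {
        var handler = ProcessException;
        if (handler != null)
        {
            handler(this, ex);
        }
    }
}

public static class Demo
{
    public static void Main()
    {
        var p = new Publisher();
        string seen = null;
        p.ProcessException += (s, e) => seen = e.Message;
        p.OnProcessException(new InvalidOperationException("boom"));
        Console.WriteLine(seen); // prints "boom"
    }
}
```

If no handler is registered, OnProcessException simply does nothing, which is what allows the queue to keep running after a processing failure.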
This leads to the following code:
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

// Simple generic EventArgs wrapper carrying the exception to the handler
public class EventArgs<T> : EventArgs
{
    public EventArgs(T data) { Data = data; }
    public T Data { get; private set; }
}

public class AsynQueue<T>
{
    // Whether a thread is currently processing queued data
    private int isProcessing;
    // A thread is processing data
    private const int Processing = 1;
    // No thread is processing data
    private const int UnProcessing = 0;
    // Whether the queue is enabled
    private volatile bool enabled = true;
    private Task currentTask;

    public event Action<T> ProcessItemFunction;
    public event EventHandler<EventArgs<Exception>> ProcessException;

    private ConcurrentQueue<T> queue;

    public AsynQueue()
    {
        queue = new ConcurrentQueue<T>();
        Start();
    }

    public int Count
    {
        get { return queue.Count; }
    }

    private void Start()
    {
        Thread process_thread = new Thread(ProcessItem);
        process_thread.IsBackground = true;
        process_thread.Start();
    }

    public void Enqueue(T items)
    {
        if (items == null)
        {
            throw new ArgumentNullException("items");
        }
        queue.Enqueue(items);
        DataAdded();
    }

    // Notify the consumer when data has been added
    private void DataAdded()
    {
        if (enabled)
        {
            if (!IsProcessingItem())
            {
                currentTask = Task.Factory.StartNew(ProcessItemLoop);
            }
        }
    }

    // Atomically test whether a thread is already processing the queue
    private bool IsProcessingItem()
    {
        return !(Interlocked.CompareExchange(ref isProcessing, Processing, UnProcessing) == 0);
    }

    private void ProcessItemLoop()
    {
        if (!enabled && queue.IsEmpty)
        {
            Interlocked.Exchange(ref isProcessing, UnProcessing);
            return;
        }

        T publishFrame;
        if (queue.TryDequeue(out publishFrame))
        {
            try
            {
                ProcessItemFunction(publishFrame);
            }
            catch (Exception ex)
            {
                OnProcessException(ex);
            }
        }

        if (enabled && !queue.IsEmpty)
        {
            currentTask = Task.Factory.StartNew(ProcessItemLoop);
        }
        else
        {
            Interlocked.Exchange(ref isProcessing, UnProcessing);
        }
    }

    /// <summary>
    /// Run by the dedicated watchdog thread.
    /// Monitors the queue for data that arrived without a consumer picking it up.
    /// </summary>
    private void ProcessItem(object state)
    {
        int sleepCount = 0;
        int sleepTime = 1000;
        while (enabled)
        {
            // If the queue is empty, back off: sleep longer as idle loops accumulate
            if (queue.IsEmpty)
            {
                if (sleepCount == 0)
                {
                    sleepTime = 1000;
                }
                else if (sleepCount <= 3)
                {
                    sleepTime = 1000 * 3;
                }
                else
                {
                    sleepTime = 1000 * 10;
                }
                sleepCount++;
                Thread.Sleep(sleepTime);
            }
            else
            {
                // If no thread is processing, start the consumer
                if (enabled && Interlocked.CompareExchange(ref isProcessing, Processing, UnProcessing) == 0)
                {
                    if (!queue.IsEmpty)
                    {
                        currentTask = Task.Factory.StartNew(ProcessItemLoop);
                    }
                    else
                    {
                        Interlocked.Exchange(ref isProcessing, UnProcessing);
                    }
                    sleepCount = 0;
                    sleepTime = 1000;
                }
            }
        }
    }

    public void Flush()
    {
        Stop();
        if (currentTask != null)
        {
            currentTask.Wait();
        }
        while (!queue.IsEmpty)
        {
            try
            {
                T publishFrame;
                if (queue.TryDequeue(out publishFrame))
                {
                    ProcessItemFunction(publishFrame);
                }
            }
            catch (Exception ex)
            {
                OnProcessException(ex);
            }
        }
        currentTask = null;
    }

    public void Stop()
    {
        this.enabled = false;
    }

    private void OnProcessException(Exception ex)
    {
        var tempException = ProcessException;
        if (tempException != null)
        {
            tempException(this, new EventArgs<Exception>(ex));
        }
    }
}
The idea behind the queue: whenever an item is enqueued, DataAdded() is called internally to check whether a consumer is already processing data. If one is, the item simply stays in the internal queue; if not, a consumer task is started. The internal consumer (a Task, which runs on the thread pool) processes the data recursively: when the current item has been handled, the next item is posted to the thread pool as a new task. This forms a processing loop and ensures that each piece of data is processed in order, one at a time. Because ConcurrentQueue.IsEmpty is only a snapshot of the state in memory at one instant, the queue may be empty now and non-empty a moment later. A daemon thread, process_thread, is therefore also needed to periodically check whether the internal consumer (on the thread pool) is processing data. Otherwise, the consumer could conclude that the queue is empty and exit just as new data arrives but before it has been inserted into the queue, in which case that data might never be processed.
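The "is a consumer already running" check described above hinges on Interlocked.CompareExchange: it atomically swaps the flag from 0 to 1 and returns the value it observed, so exactly one caller can win the right to start the consumer. A standalone sketch of that gate (the Gate and GateDemo names are illustrative):

```csharp
using System;
using System.Threading;

public static class Gate
{
    private static int isProcessing = 0; // 0 = idle, 1 = a consumer is running

    // Atomically swap 0 -> 1; returns true only for the single caller
    // that observed 0, i.e. the one that wins the right to consume.
    public static bool TryBegin()
    {
        return Interlocked.CompareExchange(ref isProcessing, 1, 0) == 0;
    }

    // Reset the flag so a later enqueue can start a new consumer.
    public static void End()
    {
        Interlocked.Exchange(ref isProcessing, 0);
    }
}

public static class GateDemo
{
    public static void Main()
    {
        bool first = Gate.TryBegin();   // flag was 0 -> this caller wins
        bool second = Gate.TryBegin();  // flag already 1 -> this caller loses
        Gate.End();                     // consumer finished, release the gate
        bool third = Gate.TryBegin();   // wins again after release
        Console.WriteLine($"{first} {second} {third}"); // prints "True False True"
    }
}
```

This is why no lock is needed around DataAdded(): concurrent producers may race to call it, but only one of them can flip the flag and start the processing loop.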
Applicable scenarios:
1. Suited to scenarios with multiple producers and a single consumer (if multiple consumers are needed, multiple separate queues, each with its own thread, can currently be used).
2. Suited to data that is processed quickly. IO operations such as file writes are not appropriate, because the internal consumer runs on background thread-pool threads: when the process shuts down, those threads are terminated with it, and the file may not yet have been written to disk.
3. Suited as the basic data structure of a pipeline processing model, with queues communicating through their respective event handlers (a later article will cover the pipeline model specifically).
Note: the internal ConcurrentQueue could also be replaced with a blocking queue (BlockingCollection). A blocking queue is simpler to use, but the internal consumer is then better served by a dedicated thread than by the thread pool: an empty blocking queue blocks its consumer, and blocking thread-pool threads is not recommended. BlockingCollection also does not perform as well as ConcurrentQueue.
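For comparison, here is a sketch of the BlockingCollection alternative mentioned in the note, using a dedicated (non-pool) consumer thread; GetConsumingEnumerable blocks while the collection is empty, which is exactly why a thread-pool thread would be a poor host for it. The names and the doubling "work" are illustrative only:

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;

public static class BlockingQueueDemo
{
    public static List<int> Run()
    {
        // BlockingCollection wrapping a ConcurrentQueue keeps FIFO order.
        var queue = new BlockingCollection<int>(new ConcurrentQueue<int>());
        var results = new List<int>();

        // Dedicated consumer thread: GetConsumingEnumerable blocks while the
        // collection is empty and completes once CompleteAdding is called.
        var consumer = new Thread(() =>
        {
            foreach (var item in queue.GetConsumingEnumerable())
            {
                results.Add(item * 2); // stand-in for real processing
            }
        });
        consumer.Start();

        for (int i = 1; i <= 3; i++)
        {
            queue.Add(i); // producers just Add; no manual signaling needed
        }
        queue.CompleteAdding(); // tells the consumer to drain and exit
        consumer.Join();
        return results;
    }

    public static void Main()
    {
        Console.WriteLine(string.Join(",", Run())); // prints "2,4,6"
    }
}
```

Note how much bookkeeping disappears (no isProcessing flag, no watchdog thread), at the cost of a permanently blocked dedicated thread and somewhat lower throughput than raw ConcurrentQueue.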
.NET Parallel Programming - 4. Implementing a High-Performance Asynchronous Queue