Parallel LINQ
1 System.Linq.ParallelEnumerable
An overview of important methods:
1) public static ParallelQuery<TSource> AsParallel<TSource>(this IEnumerable<TSource> source); Enables parallelization of a query.
2) public static ParallelQuery<TSource> AsOrdered<TSource>(this ParallelQuery<TSource> source); Enables processing that treats the data source as ordered, overriding the default processing that treats the data source as unordered. AsOrdered can only be called on generic sequences returned by AsParallel, ParallelEnumerable.Range, and ParallelEnumerable.Repeat.
3) public static ParallelQuery<TSource> WithExecutionMode<TSource>(this ParallelQuery<TSource> source, ParallelExecutionMode executionMode); Sets the execution mode of the query.
4) public static double Average(this ParallelQuery<double> source); Computes the average of a sequence.
5) public static decimal? Max(this ParallelQuery<decimal?> source); Computes the maximum value of a sequence.
6) public static decimal? Min(this ParallelQuery<decimal?> source); Computes the minimum value of a sequence.
7) public static decimal? Sum(this ParallelQuery<decimal?> source); Computes the sum of a sequence.
8) public static TResult Aggregate<TSource, TAccumulate, TResult>(this ParallelQuery<TSource> source, TAccumulate seed, Func<TAccumulate, TSource, TAccumulate> func, Func<TAccumulate, TResult> resultSelector); Applies an accumulator function in parallel over a sequence, using the specified seed value as the initial accumulator value and selecting the result value with the specified function.
9) public static ParallelQuery<TSource> WithCancellation<TSource>(this ParallelQuery<TSource> source, CancellationToken cancellationToken); Sets the System.Threading.CancellationToken to associate with the query.
10) public static ParallelQuery<TSource> WithDegreeOfParallelism<TSource>(this ParallelQuery<TSource> source, int degreeOfParallelism); Sets the degree of parallelism to use in the query.
11) public static void ForAll<TSource>(this ParallelQuery<TSource> source, Action<TSource> action); Invokes the specified action in parallel on each element in source.
12) public static ParallelQuery<TSource> WithMergeOptions<TSource>(this ParallelQuery<TSource> source, ParallelMergeOptions mergeOptions); Sets the merge options for this query, which specify how the query buffers its output.
Description
1) PLINQ implements all of the LINQ operators and adds some parallel-specific operators.
2) PLINQ can be used with both concurrent collections and traditional collections.
3) By default, .NET avoids high-overhead parallelization algorithms when executing PLINQ; to force parallel execution, use ParallelExecutionMode.ForceParallelism.
4) Depending on the number of available cores, PLINQ partitions the data source and processes each partition on a different core. There is no fixed order in which the partitions are executed.
5) PLINQ queries use deferred execution, so exceptions produced by the query are thrown when the results are consumed, and must be caught at that point.
6) One of the Aggregate overloads partitions the source sequence into several sub-sequences (partitions). It applies updateAccumulatorFunc to each element within a partition, producing a single accumulated result per partition. combineAccumulatorsFunc is then applied to the per-partition results to produce a single value. Finally, that value is transformed by the resultSelector function to obtain the final result.
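The deferred-execution point above can be illustrated with a minimal sketch (the query and data are illustrative, not from the original article):

```csharp
using System;
using System.Linq;

class DeferredDemo
{
    static void Main()
    {
        var numbers = Enumerable.Range(0, 10).ToList();

        // Building the query runs nothing yet: the division by zero below
        // is NOT thrown on this line.
        var query = numbers.AsParallel().Select(n => 100 / n);

        try
        {
            // The query executes only when consumed; PLINQ wraps exceptions
            // thrown on worker threads in an AggregateException.
            var results = query.ToList();
        }
        catch (AggregateException ae)
        {
            foreach (var e in ae.InnerExceptions)
                Console.WriteLine(e.GetType().Name); // DivideByZeroException
        }
    }
}
```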
2 Usage examples
In the examples below, List<T> list = ... is defined, and condition denotes a filter condition in the code.
1) Sorting
// ensure that the output order matches the order of the original collection
var resLinq1 = from item in list.AsParallel().AsOrdered()
               where condition
               select item;
var res1 = list.AsParallel().AsOrdered().Where(m => m > 4);

// use ascending for ascending order, descending for descending order
var resLinq2 = from item in list.AsParallel()
               where condition
               orderby item descending
               select item;

// ascending order uses OrderBy, descending order uses OrderByDescending
var res2 = list.AsParallel().Where(item => condition).OrderByDescending(m => m);
2) Set the execution mode of the query
// force parallelization of the entire query
var resLinq3 = from item in list.AsParallel()
                               .WithExecutionMode(ParallelExecutionMode.ForceParallelism)
               where condition
               orderby item descending
               select item;

var res3 = list.AsParallel()
               .WithExecutionMode(ParallelExecutionMode.ForceParallelism)
               .Where(item => condition)
               .OrderByDescending(m => m);
3) Reduction operations
Suppose the elements in the list are numbers.
var resLinq4 = (from item in list.AsParallel()
                where condition
                select item).Average();
var res4 = list.AsParallel().Where(item => condition).Average();

var resLinq5 = (from item in list.AsParallel()
                where condition
                select item).Max();
var res5 = list.AsParallel().Where(item => condition).Max();
4) Custom aggregation function
Suppose the elements in the list are numbers.
// variance formula: S^2 = ((x1 - a)^2 + (x2 - a)^2 + ... + (xn - a)^2) / n,
// where a is the average, n is the number of elements, and xi is the i-th element
// sum: the running partial result; item: an element of the list; result: the computed variance
var average = list.AsParallel().Average();
var res6 = list.AsParallel().Aggregate(
    0d,
    (sum, item) => sum + Math.Pow(item - average, 2),
    result => result / list.Count
);

// same as above, with an extra combineAccumulatorsFunc to merge per-partition results
var res7 = list.AsParallel().Aggregate(
    0d,
    (sum, item) => sum + Math.Pow(item - average, 2),
    (total, thisTask) => total + thisTask,
    result => result / list.Count
);
5) Cancelling a parallel operation
var cts = new CancellationTokenSource();
CancellationToken ct = cts.Token;
CancelParallel(ct, list);

private static void CancelParallel(CancellationToken ct, List<int> list)
{
    // conditionExec: the parallel operation is cancelled when this condition is true
    if (conditionExec)
    {
        ct.ThrowIfCancellationRequested();
    }
    var average = list.AsParallel().Average();
    list.AsParallel().WithCancellation(ct).Aggregate(
        0d,
        (sum, item) => sum + Math.Pow(item - average, 2),
        (total, thisTask) => total + thisTask,
        result => result / list.Count
    );
}
6) Specify the degree of parallelism
int maxDegreeOfParallelism = Environment.ProcessorCount;
var average = list.AsParallel().Average();
var res8 = list.AsParallel().WithDegreeOfParallelism(maxDegreeOfParallelism).Aggregate(
    0d,
    (sum, item) => sum + Math.Pow(item - average, 2),
    result => result / list.Count
);
7) Use ForAll
var bag = new ConcurrentBag<T>();
list.AsParallel().Where(item => condition).ForAll(item =>
{
    // add the element to the collection after it has been processed
    bag.Add(itemAfter);
});
8) Exception Handling
Use AggregateException to handle exceptions; see the Tasks.Parallel section for a concrete example.
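A minimal sketch of catching a PLINQ failure as an AggregateException (the data and predicate are illustrative):

```csharp
using System;
using System.Linq;

class PlinqExceptionDemo
{
    static void Main()
    {
        int[] data = { 1, 2, 0, 4 };
        try
        {
            // Forcing enumeration executes the query; exceptions from all
            // worker threads are collected into one AggregateException.
            var res = data.AsParallel().Select(n => 10 / n).ToArray();
        }
        catch (AggregateException ae)
        {
            // Handle marks matching exceptions as observed;
            // any unmatched exception is rethrown.
            ae.Handle(e => e is DivideByZeroException);
        }
    }
}
```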
Thread pool
1 The CLR 4 thread pool engine and threads
- The CLR thread pool engine manages a pool of threads that execute work items. At intervals it creates additional idle threads; these idle threads take work items off the queue in FIFO order and execute them.
- Creating a managed thread requires thousands of CPU cycles and consumes memory.
- The CLR thread pool engine maintains a minimum number of idle worker threads, usually equal to the number of logical cores.
- All threads managed by the CLR thread pool engine are background threads; that is, once all foreground threads exit, background threads do not keep the application running.
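The background-thread point can be observed with a small sketch (not from the original article):

```csharp
using System;
using System.Threading;

class BackgroundDemo
{
    static void Main()
    {
        ThreadPool.QueueUserWorkItem(_ =>
        {
            // Thread-pool threads report themselves as background threads.
            Console.WriteLine(Thread.CurrentThread.IsThreadPoolThread); // True
            Console.WriteLine(Thread.CurrentThread.IsBackground);       // True
        });
        // Keep the foreground thread alive long enough for the work item
        // to run; if Main returned immediately, the background thread
        // would not keep the process alive.
        Thread.Sleep(200);
    }
}
```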
2 Global and local queues
- When the TPL creates a task, a new work item is added to the thread pool's global queue. When all available worker threads are busy executing work items, newly queued work must wait until a worker thread becomes available.
- Each thread-pool thread assigned to a task has its own local queue, which reduces contention on the global queue. Local queues typically extract tasks in LIFO order and execute them.
3 System.Threading.ThreadPool
Compared with queueing a work item on the thread pool directly, creating a Task instance has some overhead, but a Task can take advantage of cancellation tokens and other features.
Use the QueueUserWorkItem method to queue a work item:
ThreadPool.QueueUserWorkItem(state => { /* specific business logic */ });
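The trade-off above can be sketched side by side; a minimal example, with illustrative work in place of real business logic:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class QueueVsTaskDemo
{
    static void Main()
    {
        // Raw thread-pool work item: minimal overhead, but no built-in
        // cancellation, result, or continuation support.
        ThreadPool.QueueUserWorkItem(state => Console.WriteLine("work item done"));

        // Task: slightly more overhead, but it supports cancellation tokens,
        // return values, and continuations.
        var cts = new CancellationTokenSource();
        var task = Task.Run(() =>
        {
            cts.Token.ThrowIfCancellationRequested();
            return 42;
        }, cts.Token);

        Console.WriteLine(task.Result); // 42
        cts.Dispose();
    }
}
```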
// workerThreads: maximum number of worker threads in the thread pool
// completionPortThreads: maximum number of asynchronous I/O threads in the thread pool
ThreadPool.GetMaxThreads(out workerThreads, out completionPortThreads);

// workerThreads: minimum number of worker threads the thread pool creates on demand
// completionPortThreads: minimum number of asynchronous I/O threads the thread pool creates on demand
ThreadPool.GetMinThreads(out workerThreads, out completionPortThreads);
-----------------------------------------------------------------------------------------
Please credit the source when reprinting or quoting.
This was written in haste and my level is limited; if anything is inaccurate, corrections are welcome.
.NET multithreaded programming: Parallel LINQ and the thread pool