.NET Parallel Programming 1


    1. This series is based on the book Parallel Programming with Microsoft .NET (design patterns for parallel programming); a Chinese translation is published by Tsinghua University Press.
      1. Related resources. Main page: http://parallelpatterns.codeplex.com/
      2. Code download: http://parallelpatterns.codeplex.com/releases/view/50473
      3. Book online: https://msdn.microsoft.com/en-us/library/ff963553.aspx
      4. Samples of parallel programming: https://code.msdn.microsoft.com/ParExtSamples
      5. Introduction page for Task: https://msdn.microsoft.com/en-us/library/system.threading.tasks.task(v=vs.110).aspx

This book introduces a number of patterns for multithreaded programming, that is, the scenarios in which multithreading is used and the .NET technologies that implement them, primarily TPL (Task Parallel Library) and PLINQ (Parallel LINQ). TPL was added in the .NET Framework 4 to encapsulate the older concepts of threads, synchronization, and the thread pool: you describe the parallel work with Task and System.Threading.Tasks.Parallel and leave the scheduling and synchronization details to the library. PLINQ is a parallel version of LINQ to Objects. An overview of these constructs follows; first, a minimal sketch of the basic Task API.
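This sketch is my own illustration, not from the book; DoWork stands in for any method of your own.

// A minimal sketch of starting a task and waiting for it (assumes .NET Framework 4).
Task t = Task.Factory.StartNew(() => DoWork());   // queue work on the thread pool
// ... do other work on the current thread ...
t.Wait();   // blocks until the task completes and observes any exception it threw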

    1. Pattern classification
      1. Data parallelism. Performing the same calculation on different pieces of data, for example the iterations of a for loop, is data parallelism.

The patterns included here are parallel loops and parallel aggregation. Parallel loops assume there are no data dependencies between iterations and place no constraints on the order in which iterations run. Parallel aggregation is similar to map-reduce: the tasks have a concurrent part but also need synchronization when partial results are combined; the .NET libraries encapsulate that synchronization so well that using it is almost as simple as not needing synchronization at all. (A small aggregation sketch follows below.)
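A minimal parallel-aggregation sketch (my own illustration, not from the book), assuming a numeric data set; PLINQ combines the per-partition partial results for you (requires using System.Linq):

double[] data = Enumerable.Range(1, 1000).Select(i => (double)i).ToArray();

// Built-in parallel reduction.
double sum = data.AsParallel().Sum();

// Custom reduction: each partition folds into its own accumulator,
// then the partial accumulators are combined.
double sumOfSquares = data.AsParallel().Aggregate(
    0.0,                                // seed for each partition
    (partial, x) => partial + x * x,    // fold an element into the partition's partial result
    (a, b) => a + b,                    // combine two partial results
    total => total);                    // final projection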

    1. Task parallelism. Here the operations performed in parallel are different from one another, and usually so is their input data.

The patterns in this group are parallel tasks, futures, dynamic task parallelism, and pipelines. (A small futures sketch follows below.)
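A minimal sketch of the futures pattern (my own illustration, not from the book): a Task<TResult> stands in for a value that is still being computed, and ContinueWith chains a follow-up step, which is also the basic building block of a pipeline. ReadData and Summarize are hypothetical methods.

// A future: the computation starts now, the result is consumed later.
Task<int[]> future = Task.Factory.StartNew(() => ReadData());

// A continuation consumes the future's result when it becomes available.
Task<string> report = future.ContinueWith(t => Summarize(t.Result));

Console.WriteLine(report.Result);   // blocks until the whole chain has finished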

    1. Parallel loops. When you need to perform the same independent operation on each element of a collection, you can use the parallel loop pattern; note that the iterations must be independent of each other.
      1. The basic case. For example, we may have a for loop like this:

int n = ...
for (int i = 0; i < n; i++)
{
    // ... do some work for iteration i
}

The corresponding parallel version:

int n = ...
Parallel.For(0, n, i =>
{
    // ...
});

foreach also has a corresponding parallel version

IEnumerable<MyObject> myEnumerable = ...
foreach (var obj in myEnumerable)
{
    // ...
}

IEnumerable<MyObject> myEnumerable = ...
Parallel.ForEach(myEnumerable, obj =>
{
    // ...
});

PLINQ multithreading example:

IEnumerable<MyObject> source = ...

// LINQ
var query1 = from i in source select Normalize(i);

// PLINQ
var query2 = from i in source.AsParallel()
             select Normalize(i);

    1. Controlling the loop. While iterations are running you can influence the rest of the loop with Break, Stop, and cancellation; Stop and cancellation are the more commonly used.
      1. Break. ParallelLoopState.Break is similar to break in a sequential for loop. Note that after Break is called, all iterations with an index smaller than the breaking iteration are still guaranteed to execute; iterations with a larger index that had already started before the Break also run to completion, but no new ones are started. This fits situations where there is an ordering dependency and everything before the break point must be completed.

int n = ...
for (int i = 0; i < n; i++)
{
    // ...
    if (/* stopping condition is true */)
        break;
}

You can exit a loop this way when you are using a multithreaded version:

int n = ...
Parallel.For(0, n, (i, loopState) =>
{
    // ...
    if (/* stopping condition is true */)
    {
        loopState.Break();
        return;
    }
});

Signature:

Parallel.For(int fromInclusive,
             int toExclusive,
             Action<int, ParallelLoopState> body);

To check whether the loop exited because of a Break:

int n = ...
var result = new double[n];
var loopResult = Parallel.For(0, n, (i, loopState) =>
{
    if (/* break condition is true */)
    {
        loopState.Break();
        return;
    }
    result[i] = DoWork(i);
});

if (!loopResult.IsCompleted &&
    loopResult.LowestBreakIteration.HasValue)
{
    Console.WriteLine("Loop encountered a break at {0}",
                      loopResult.LowestBreakIteration.Value);
}

    1. Stop. Similar to Break, except that Stop makes no guarantee about lower index values: iterations that are already executing run to completion, but no further iterations, whatever their index, are started. Use it when there is no dependency at all between iterations, so as soon as one iteration calls Stop the remaining iterations no longer need to be scheduled.

var n = ...
var loopResult = Parallel.For(0, n, (i, loopState) =>
{
    if (/* stopping condition is true */)
    {
        loopState.Stop();
        return;
    }
    result[i] = DoWork(i);
});

if (!loopResult.IsCompleted &&
    !loopResult.LowestBreakIteration.HasValue)
{
    Console.WriteLine("Loop was stopped");
}

    1. External cancellation of the loop. For loops whose iterations take a long time, you can support cancellation from outside: pass a CancellationToken through ParallelOptions, and optionally check the token inside each iteration.

void DoLoop(CancellationTokenSource cts)
{
    int n = ...
    CancellationToken token = cts.Token;
    var options = new ParallelOptions
                      { CancellationToken = token };
    try
    {
        Parallel.For(0, n, options, (i) =>
        {
            // ...
            // ... optionally check to see if cancellation happened
            if (token.IsCancellationRequested)
            {
                // ... optionally exit this iteration early
                return;
            }
        });
    }
    catch (OperationCanceledException ex)
    {
        // ... handle the loop cancellation
    }
}

Function Signature:

Parallel.For(int fromInclusive,
             int toExclusive,
             ParallelOptions parallelOptions,
             Action<int> body);

Question: if you use cancellation, will the remaining iterations still be scheduled? According to the documentation, once cancellation is requested no new iterations are started; iterations that are already running finish, and the loop then throws an OperationCanceledException. The sketch below illustrates this.
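A small experiment sketch (my own, not from the book): cancellation is requested shortly after the loop starts, and the number of iterations that actually ran is printed.

using System;
using System.Threading;
using System.Threading.Tasks;

class CancelExperiment
{
    static void Main()
    {
        var cts = new CancellationTokenSource();
        var options = new ParallelOptions { CancellationToken = cts.Token };
        int executed = 0;

        // Request cancellation from another task after a short delay.
        Task.Factory.StartNew(() => { Thread.Sleep(50); cts.Cancel(); });

        try
        {
            Parallel.For(0, 1000000, options, i =>
            {
                Interlocked.Increment(ref executed);
                Thread.SpinWait(100000);   // simulate a little work per iteration
            });
        }
        catch (OperationCanceledException)
        {
            // Iterations that were already running finished; no new ones were started.
            Console.WriteLine("Canceled after {0} iterations (out of 1000000).", executed);
        }
    }
}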

    1. Exception handling. If an exception is thrown in one iteration, no new iterations are started; iterations that are already running are allowed to complete. All exceptions thrown by iterations are then collected and rethrown wrapped in a single AggregateException.
    2. Batching small loop bodies. Some loop bodies take very little time to run; dispatching a separate task for every iteration would clearly cost more than it gains. You can partition the iteration range so that each task processes a chunk of iterations, for example 100 at a time. In the following example the partitioner chooses the chunk size automatically, based on the number of CPU cores.

int n = ...
double[] result = new double[n];
Parallel.ForEach(Partitioner.Create(0, n),
    (range) =>
    {
        for (int i = range.Item1; i < range.Item2; i++)
        {
            // very small, equally sized blocks of work
            result[i] = (double)(i * i);
        }
    });

Function Signature:

Parallel.ForEach<TSource>(
    Partitioner<TSource> source,
    Action<TSource> body);

The following call processes 50,000 iterations per range:

double[] result = new double[1000000];
Parallel.ForEach(Partitioner.Create(0, 1000000, 50000),
    (range) =>
    {
        for (int i = range.Item1; i < range.Item2; i++)
        {
            // small, equally sized blocks of work
            result[i] = (double)(i * i);
        }
    });

Here System.Collections.Concurrent.Partitioner.Create splits the index range into sub-ranges of type IEnumerable<Tuple<int, int>>.

    1. Controlling the degree of parallelism. By default TPL decides how many tasks run concurrently based on the number of CPU cores; you can cap the maximum number of parallel tasks with ParallelOptions.MaxDegreeOfParallelism.

var n = ...
var options = new ParallelOptions()
                  { MaxDegreeOfParallelism = 2 };
Parallel.For(0, n, options, i =>
{
    // ...
});

Function Signature:

Parallel.For(int fromInclusive,
             int toExclusive,
             ParallelOptions parallelOptions,
             Action<int> body);

PLINQ usage example:

IEnumerable<T> myCollection = // ...
myCollection.AsParallel()
    .WithDegreeOfParallelism(8)
    .ForAll(obj => { /* ... */ });

    1. Using task-local state in the loop body. Each task gets its own local value from the localInit delegate (here a Random instance, which is not thread-safe and therefore should not be shared across tasks); the value is threaded through the body and passed to localFinally when the task finishes.

int numberOfSteps = 10000000;
double[] result = new double[numberOfSteps];
Parallel.ForEach(
    Partitioner.Create(0, numberOfSteps),
    new ParallelOptions(),
    () => { return new Random(MakeRandomSeed()); },
    (range, loopState, random) =>
    {
        for (int i = range.Item1; i < range.Item2; i++)
            result[i] = random.NextDouble();
        return random;
    },
    _ => { });

Function Signature:

Parallel.ForEach<TSource, TLocal>(
    OrderablePartitioner<TSource> source,
    ParallelOptions parallelOptions,
    Func<TLocal> localInit,
    Func<TSource, ParallelLoopState, TLocal, TLocal> body,
    Action<TLocal> localFinally)

    1. Parallel tasks. If there are several independent operations that can run at the same time, you can use the parallel tasks pattern. For example:

Parallel.Invoke(DoLeft, DoRight);

This is equivalent to the following:

Task t1 = Task.Factory.StartNew(DoLeft);
Task t2 = Task.Factory.StartNew(DoRight);
Task.WaitAll(t1, t2);

    1. Handling exceptions (https://msdn.microsoft.com/en-us/library/dd997415(v=vs.110).aspx). Use Wait and WaitAll to observe exceptions thrown by tasks; WaitAny does not observe them. Observed exceptions arrive wrapped in an AggregateException, and you can use its Handle method to process the inner exceptions.

try
{
    Task t = Task.Factory.StartNew(...);
    // ...
    t.Wait();
}
catch (AggregateException ae)
{
    ae.Handle(e =>
    {
        if (e is MyException)
        {
            // ... handle exception ...
            return true;
        }
        else
        {
            return false;
        }
    });
}

Because tasks can nest other tasks, the aggregate can itself contain nested AggregateExceptions, forming a multilevel tree. Call Flatten to flatten the tree before calling Handle, so that every exception in the aggregate gets handled.

try
{
    Task t1 = Task.Factory.StartNew(() =>
    {
        Task t2 = Task.Factory.StartNew(() =>
        {
            // ...
            throw new MyException();
        });
        // ...
        t2.Wait();
    });
    // ...
    t1.Wait();
}
catch (AggregateException ae)
{
    ae.Flatten().Handle(e =>
    {
        if (e is MyException)
        {
            // ... handle exception ...
            return true;
        }
        else
        {
            return false;
        }
    });
}

    1. Waiting for the first task to complete. You can use WaitAny to wait for the first task to finish. Note that WaitAny does not observe exceptions, so a WaitAll is added afterwards to observe and handle them.

var taskIndex = -1;
Task[] tasks = new Task[]
{
    Task.Factory.StartNew(DoLeft),
    Task.Factory.StartNew(DoRight),
    Task.Factory.StartNew(DoCenter)
};
Task[] allTasks = tasks;

// Print completion notices one by one as tasks finish.
while (tasks.Length > 0)
{
    taskIndex = Task.WaitAny(tasks);
    Console.WriteLine("Finished task {0}.", taskIndex + 1);
    tasks = tasks.Where((t) => t != tasks[taskIndex]).ToArray();
}

// Observe any exceptions that might have occurred.
try
{
    Task.WaitAll(allTasks);
}
catch (AggregateException ae)
{
    // ...
}

The following example waits for the first task to complete, then cancels the remaining tasks; exceptions caused by the cancellation itself are filtered out, and any other exception is rethrown.

public static void SpeculativeInvoke(
    params Action<CancellationToken>[] actions)
{
    var cts = new CancellationTokenSource();
    var token = cts.Token;
    var tasks =
        (from a in actions
         select Task.Factory.StartNew(() => a(token), token))
        .ToArray();

    // Wait for fastest task to complete.
    Task.WaitAny(tasks);

    // Cancel all of the slower tasks.
    cts.Cancel();

    // Wait for cancellation to finish and observe exceptions.
    try
    {
        Task.WaitAll(tasks);
    }
    catch (AggregateException ae)
    {
        // Filter out the exception caused by cancellation itself.
        ae.Flatten().Handle(e => e is OperationCanceledException);
    }
    finally
    {
        if (cts != null) cts.Dispose();
    }
}

    1. Mistakes beginners easily make.
      1. Variables captured by closures. Consider this piece of code.

for (int i = 0; i < 4; i++)
{
    // WARNING: BUGGY CODE, i has unexpected value
    Task.Factory.StartNew(() => Console.WriteLine(i));
}

You might expect it to print the numbers 0, 1, 2, 3, just in some scrambled order. In fact, you are likely to see 4, 4, 4, 4, because all the tasks capture and read the same variable i, which has already reached 4 by the time they run. You can avoid the problem like this:

for (int i = 0; i < 4; i++)
{
    var tmp = i;
    Task.Factory.StartNew(() => Console.WriteLine(tmp));
}

    1. Disposing a resource the task still needs. Consider this piece of code.

Task<string> t;
using (var file = new StringReader("text"))
{
    t = Task<string>.Factory.StartNew(() => file.ReadLine());
}
// WARNING: BUGGY CODE, file has been disposed
Console.WriteLine(t.Result);

By the time the task actually runs, the file may already have been disposed. (A possible fix is sketched below.)
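One possible fix (a sketch of my own, not from the book) is to let the task own the resource, so it is disposed only after the task has used it:

Task<string> t = Task<string>.Factory.StartNew(() =>
{
    using (var file = new StringReader("text"))
    {
        return file.ReadLine();   // the reader is disposed only after the task is done with it
    }
});
Console.WriteLine(t.Result);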

    1. The life cycle of the task.

    1. Task scheduling mechanism. For an example of a custom task scheduler, see "How to: Create a Task Scheduler That Limits the Degree of Concurrency" (a usage sketch follows this list).
    1. Parallel aggregation computations.
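A usage sketch (my own, based on the MSDN sample referenced above, which defines LimitedConcurrencyLevelTaskScheduler; the class itself is not reproduced here): tasks queued through a TaskFactory built on that scheduler run at most two at a time.

// LimitedConcurrencyLevelTaskScheduler comes from the MSDN "How to" sample above.
var scheduler = new LimitedConcurrencyLevelTaskScheduler(2);
var factory = new TaskFactory(scheduler);

for (int i = 0; i < 10; i++)
{
    int n = i;   // avoid the closure-capture pitfall described earlier
    factory.StartNew(() => Console.WriteLine("Task {0} running with limited concurrency", n));
}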

