In the previous post, we looked at several common synchronization primitives. This article introduces declarative synchronization and the sharing of collection-type data. First, let's look at declarative synchronization.
Declarative Synchronization
We can mark a class with the Synchronization attribute so that all of its fields and methods are synchronized. To use Synchronization, the target class must inherit from System.ContextBoundObject. Let's rewrite the example from the previous post using Synchronization:
// SynchronizationAttribute lives in System.Runtime.Remoting.Contexts.
[Synchronization]
class SumClass : ContextBoundObject
{
    private int _sum;

    public void Increment()
    {
        _sum++;
    }

    public int GetSum()
    {
        return _sum;
    }
}
class Program
{
    static void Main(string[] args)
    {
        var sum = new SumClass();
        Task[] tasks = new Task[10];
        for (int i = 0; i < 10; i++)
        {
            tasks[i] = new Task(() =>
            {
                for (int j = 0; j < 1000; j++)
                {
                    sum.Increment();
                }
            });
            tasks[i].Start();
        }
        Task.WaitAll(tasks);
        Console.WriteLine("Expected value {0}, Parallel value: {1}",
            10000, sum.GetSum());
        Console.ReadLine();
    }
}
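For comparison, here is a rough sketch of what the [Synchronization] attribute saves us from writing by hand: the same counter protected with an explicit lock statement (the class name SumClassWithLock and the _syncRoot field are illustrative, not code from the previous post). The attribute lets the runtime serialize every call into the object through its synchronization context, so we get the same guarantee without a lock block in every member, at the cost of the proxy overhead that comes with ContextBoundObject.
class SumClassWithLock
{
    private readonly object _syncRoot = new object();
    private int _sum;

    public void Increment()
    {
        lock (_syncRoot)   // only one thread at a time may update the counter
        {
            _sum++;
        }
    }

    public int GetSum()
    {
        lock (_syncRoot)   // read under the same lock to observe the latest value
        {
            return _sum;
        }
    }
}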
Concurrent Collections
In some cases we need to operate on a collection from multiple threads, and collections suffer from the same data-sharing problems. Consider the following example: we create a queue, fill it with 1000 items, and then have multiple tasks dequeue from it concurrently, using a counter to record how many items are dequeued:
static void Main(string[] args)
{
    for (var j = 0; j < 10; j++)
    {
        var queue = new Queue<int>();
        var count = 0;
        for (var i = 0; i < 1000; i++)
        {
            queue.Enqueue(i);
        }
        var tasks = new Task[10];
        for (var i = 0; i < tasks.Length; i++)
        {
            tasks[i] = new Task(() =>
            {
                while (queue.Count > 0)
                {
                    var item = queue.Dequeue();
                    Interlocked.Increment(ref count);
                }
            });
            tasks[i].Start();
        }
        try
        {
            Task.WaitAll(tasks);
        }
        catch (AggregateException e)
        {
            e.Handle(ex =>
            {
                Console.WriteLine("Exception Message: {0}", ex.Message);
                return true;
            });
        }
        Console.WriteLine("Dequeue items count: {0}", count);
    }
    Console.ReadKey();
}
In the example above, the whole test is repeated ten times so that the problem is easier to observe in the output.
Two problems stand out: 1. the number of dequeued items can exceed 1000; 2. an exception is thrown, and its message may even be empty. Both come down to data races. Queue<int> is not thread-safe, so when several threads call Dequeue at the same time they can operate on the same element and corrupt the queue's internal state, which is why the counter can end up above 1000. The exception happens because the check and the dequeue are two separate operations: when only one element is left, a thread evaluates queue.Count > 0, finds it is indeed greater than 0, and enters the loop body to dequeue; meanwhile another thread that also passed the check dequeues the last element first, so the first thread calls Dequeue on an already empty queue.
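Before reaching for the concurrent collections, note that the race can also be removed by making the check and the Dequeue one atomic step. Here is a minimal sketch of the task body, assuming the queue, count, and tasks variables from the example above plus a shared lock object (here called syncRoot, which is not in the original code):
var syncRoot = new object();          // shared by all ten tasks

tasks[i] = new Task(() =>
{
    while (true)
    {
        lock (syncRoot)
        {
            // The emptiness check and the Dequeue now happen atomically,
            // so no task can dequeue from an already empty queue.
            if (queue.Count == 0)
            {
                break;
            }
            queue.Dequeue();
            count++;                  // protected by the same lock, so Interlocked is not needed
        }
    }
});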
.NET 4.0 provides a number of concurrent collection types, in the System.Collections.Concurrent namespace, to handle data sharing for collections, including the following (a brief sketch of their basic use follows the list):
1. ConcurrentQueue<T>: a thread-safe first-in, first-out (FIFO) queue;
2. ConcurrentStack<T>: a thread-safe last-in, first-out (LIFO) stack;
3. ConcurrentBag<T>: a thread-safe, unordered collection of items;
4. ConcurrentDictionary<TKey, TValue>: a thread-safe key/value collection.
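The stack, bag, and dictionary follow the same pattern as the queue: operations that can fail return a bool from a Try method instead of throwing. A quick sketch of their basic calls (the values and keys used here are arbitrary):
using System.Collections.Concurrent;

var stack = new ConcurrentStack<int>();
stack.Push(1);
int top;
if (stack.TryPop(out top)) { /* got the most recently pushed item */ }

var bag = new ConcurrentBag<int>();
bag.Add(2);
int any;
if (bag.TryTake(out any)) { /* got some item; order is not guaranteed */ }

var dict = new ConcurrentDictionary<string, int>();
dict.TryAdd("a", 1);                              // returns false if the key already exists
dict.AddOrUpdate("a", 1, (key, old) => old + 1);  // insert, or apply the update function
int value;
if (dict.TryGetValue("a", out value)) { /* thread-safe read */ }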
Here we will only try ConcurrentQueue<T>. A concurrent queue is a thread-safe FIFO queue: items are added with Enqueue() and removed with TryDequeue(), which returns false instead of throwing when the queue is empty. Rewriting the preceding example with ConcurrentQueue<T>:
static void Main(string[] args)
{
    for (var j = 0; j < 10; j++)
    {
        var queue = new ConcurrentQueue<int>();
        var count = 0;
        for (var i = 0; i < 1000; i++)
        {
            queue.Enqueue(i);
        }
        var tasks = new Task[10];
        for (var i = 0; i < tasks.Length; i++)
        {
            tasks[i] = new Task(() =>
            {
                while (queue.Count > 0)
                {
                    // TryDequeue returns false instead of throwing when the queue is empty.
                    int item;
                    if (queue.TryDequeue(out item))
                    {
                        Interlocked.Increment(ref count);
                    }
                }
            });
            tasks[i].Start();
        }
        Task.WaitAll(tasks);
        Console.WriteLine("Dequeue items count: {0}", count);
    }
    Console.ReadKey();
}
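Note that while (queue.Count > 0) is still a separate check followed by a dequeue, but with ConcurrentQueue<T> that race is harmless: if another task empties the queue in between, TryDequeue simply returns false and the counter is not incremented, so every run dequeues exactly 1000 items and no exception is thrown.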