Implementing Concurrency in C#


Performance tests for several approaches to concurrency in C#

0x00 Background

This started with a program I wrote last year. It needed to send messages over the LAN to support a few commands and simple data transmission, so I wrote a C/S communication module. The approach at the time was simple: the server waits for connections and opens one thread per user, which runs a while loop receiving and processing data; the thread ends when the user quits (i.e. when a QUIT command is received). The program has been running fine (it does, of course, also handle TCP "sticky packets", message framing, and so on), but as more and more people use it, the thread overhead becomes a concern: with 100 user connections the server creates 100+ threads, and with 500 users, 500 threads, which is really excessive (not that it has that many users, of course). Since TCP traffic is not continuous, an alternative is to keep all client connections in a single list and poll it, handing each receive off to a thread that is released as soon as the receive finishes. That way the thread pool can be exploited, avoiding the memory and CPU cost of a large number of threads.

Polling with a thread pool reuses threads, so its resource overhead is certainly much lower; but can it perform as many receives per unit of time as keeping a thread per connection? That is the question this test is meant to answer.

0x01 Experimental method

IDE: VS2015

.NET Framework 4.5

The object that receives the data is as follows
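The original listing did not survive in this copy; a minimal reconstruction of such a receiver, with the class and member names invented here, might look like:

```csharp
using System;

// Hypothetical stand-in for the article's receiver object (the original
// listing is missing). ReceiveData returns data with a configurable
// probability -- 1% by default -- to simulate a connection that only
// occasionally has something to read.
public class Connection
{
    private static readonly Random Rand = new Random();
    private readonly int _id;

    public Connection(int id) { _id = id; }

    // Returns the "received" message, or null when nothing arrived.
    public string ReceiveData(double probability = 0.01)
    {
        lock (Rand)   // Random is not thread-safe
        {
            return Rand.NextDouble() < probability
                ? "data from connection " + _id
                : null;
        }
    }
}
```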

On each call, the ReceiveData method has only a 1% chance of actually receiving data; creating n such objects simulates a TCP server handling n connections, since TCP communication is not continuous. This percentage is, of course, adjustable. The program's output includes how many receive operations are performed per second, the number of threads doing the receiving, the content received, and so on.

0x02 Keeping threads for concurrency

Keeping threads alive is the most intuitive form of concurrency: each time an object is created, open a new thread that loops calling ReceiveData, and output the relevant information to the main window whenever data arrives. The code looks like this:
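The code block is missing from this copy; a sketch of the thread-per-object approach follows (the Connection stub and the report callback are stand-ins, not the original code):

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Minimal stand-in for the receiver described in 0x01 (assumed, not original).
public class Connection
{
    private static readonly Random Rand = new Random();
    public double Probability = 0.01;   // chance of receiving on each call

    public string ReceiveData()
    {
        lock (Rand) { return Rand.NextDouble() < Probability ? "data" : null; }
    }
}

public static class KeepThreadsDemo
{
    // One dedicated thread per connection, each looping on ReceiveData.
    public static void Start(List<Connection> connections, Action<string> report)
    {
        foreach (var conn in connections)
        {
            var t = new Thread(() =>
            {
                while (true)            // the real code exits on a QUIT command
                {
                    string msg = conn.ReceiveData();
                    if (msg != null)
                        report(msg);    // e.g. marshal to the UI thread
                }
            });
            t.IsBackground = true;      // don't keep the process alive
            t.Start();
        }
    }
}
```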

0x03 Polling concurrency with ThreadPool

The method is to put all the objects into a List (or another container) and create one thread for the sweep (a plain Thread is used to avoid any UI freeze; since the operations dispatched after this thread starts are switch-intensive, it makes little difference whether the sweep thread itself comes from ThreadPool or not). In this thread, a foreach (or for) loop calls ReceiveData on each object in turn, queuing each call onto a thread-pool thread. The code is as follows:
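Since the listing is missing here too, the shape described above might be sketched as (Connection stub and report callback are assumptions, not the original code):

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Minimal stand-in for the receiver described in 0x01 (assumed, not original).
public class Connection
{
    private static readonly Random Rand = new Random();
    public double Probability = 0.01;

    public string ReceiveData()
    {
        lock (Rand) { return Rand.NextDouble() < Probability ? "data" : null; }
    }
}

public static class ThreadPoolPollingDemo
{
    // One polling thread sweeps the list; each ReceiveData call is queued
    // to a thread-pool thread so the sweep itself is never held up.
    public static void Start(List<Connection> connections, Action<string> report)
    {
        var poller = new Thread(() =>
        {
            while (true)                // the real code has a stop condition
            {
                foreach (var conn in connections)
                {
                    ThreadPool.QueueUserWorkItem(_ =>
                    {
                        string msg = conn.ReceiveData();
                        if (msg != null)
                            report(msg);
                    });
                }
            }
        });
        poller.IsBackground = true;
        poller.Start();
    }
}
```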

0x04 Polling concurrency with Task

The method is similar to the ThreadPool version, except that each thread-pool thread executing ReceiveData is obtained through a Task. The code looks like this:
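The Task-based listing is also missing from this copy; a sketch under the same assumptions (invented Connection stub and report callback) would differ from the ThreadPool version only in how the work item is dispatched:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

// Minimal stand-in for the receiver described in 0x01 (assumed, not original).
public class Connection
{
    private static readonly Random Rand = new Random();
    public double Probability = 0.01;

    public string ReceiveData()
    {
        lock (Rand) { return Rand.NextDouble() < Probability ? "data" : null; }
    }
}

public static class TaskPollingDemo
{
    // Identical sweep to the ThreadPool version, but each receive is
    // dispatched with Task.Run (which itself schedules onto the pool).
    public static void Start(List<Connection> connections, Action<string> report)
    {
        var poller = new Thread(() =>
        {
            while (true)                // the real code has a stop condition
            {
                foreach (var conn in connections)
                {
                    Task.Run(() =>
                    {
                        string msg = conn.ReceiveData();
                        if (msg != null)
                            report(msg);
                    });
                }
            }
        });
        poller.IsBackground = true;
        poller.Start();
    }
}
```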

0x05 Polling concurrency with await

Again similar to the ThreadPool version, except that an await is performed each time a thread-pool thread is used to execute ReceiveData. The code is as follows:

My first version put the await directly inside the foreach, which blocked the polling thread. Because the test's ReceiveData() deliberately avoids sleeping (to widen the gap between methods as much as possible), I didn't notice the problem at first; thanks to Wind Fox for pointing it out.

The modified code looks like this, so that the test method returns immediately. Even so, async/await was simply not built for this scenario.
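The corrected listing is missing from this copy; the fix described above (an async helper that returns at the await, so the foreach is no longer serialized) might be sketched as, with the Connection stub and report callback invented here:

```csharp
using System;
using System.Threading.Tasks;

// Minimal stand-in for the receiver described in 0x01 (assumed, not original).
public class Connection
{
    private static readonly Random Rand = new Random();
    public double Probability = 0.01;

    public string ReceiveData()
    {
        lock (Rand) { return Rand.NextDouble() < Probability ? "data" : null; }
    }
}

public static class AwaitPollingDemo
{
    // The async helper yields back to the caller at the await, so the
    // polling sweep is not blocked by each receive.
    public static async Task ReceiveAsync(Connection conn, Action<string> report)
    {
        string msg = await Task.Run(() => conn.ReceiveData());
        if (msg != null)
            report(msg);
    }

    // The sweep then fires ReceiveAsync without awaiting each call:
    //   foreach (var conn in connections)
    //   {
    //       var ignored = ReceiveAsync(conn, report);   // fire-and-forget
    //   }
}
```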

0x06 Concurrency with Parallel

Parallel is provided by the FCL; Parallel.ForEach runs the loop body for each element concurrently on thread-pool threads. The code looks like this:
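With the original listing lost, a sketch of this variant (Connection stub and report callback assumed, as before) might be:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Minimal stand-in for the receiver described in 0x01 (assumed, not original).
public class Connection
{
    private static readonly Random Rand = new Random();
    public double Probability = 0.01;

    public string ReceiveData()
    {
        lock (Rand) { return Rand.NextDouble() < Probability ? "data" : null; }
    }
}

public static class ParallelPollingDemo
{
    // Parallel.ForEach partitions the list across thread-pool threads and
    // blocks until one full sweep completes; the outer loop then starts
    // the next sweep.
    public static void Poll(List<Connection> connections, Action<string> report)
    {
        while (true)                    // the real code has a stop condition
        {
            Parallel.ForEach(connections, conn =>
            {
                string msg = conn.ReceiveData();
                if (msg != null)
                    report(msg);
            });
        }
    }
}
```

Note that, unlike the hand-written sweeps above, no foreach (for) loop needs to be written; only the loop body is supplied.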

0x07 Test results

500 objects are created to simulate 500 connections. In the results, the receives-per-second figure fluctuates within a range, so pay attention mainly to the hundreds digit and above. The methods that use thread-pool threads (ThreadPool, Task, await, Parallel) show slightly different thread counts, which probably depends on the execution environment and doesn't indicate a substantive difference. The exception is await: because of the extra thread switching, each piece of work takes a little longer, so the thread pool has to create more threads.

1. Keeping threads for concurrency

On average, 8,654 receives per second. 500 threads are created right after startup, and memory consumption is high because every thread needs its own stack space. Frequent thread switching also adds load on the CPU.

2. ThreadPool polling concurrency

On average, 9,529 receives per second. Because thread-pool threads are reused, not many threads need to be created; memory stays flat and CPU consumption is more even.

3. Task polling concurrency

On average, 9,322 receives per second. Since Task is itself a wrapper over the thread pool, the result differs little from ThreadPool.

4. await polling concurrency

On average, 4,150 receives per second. await also uses thread-pool threads, so its memory overhead and thread count differ little from the other pool-based methods. However, awaiting switches the execution context off and back onto pool threads, so the CPU overhead is higher.

5. Parallel concurrency

The name says it all: Parallel is designed exactly for this kind of workload. On average, 9,387 receives per second, on thread-pool threads, so memory and CPU consumption are similar to ThreadPool and Task; but you don't need to write the foreach (for) loop yourself, only the loop body.

6. Supplementary testing

Further testing shows that as ReceiveData() gets more time-consuming, the advantage of the polling methods shrinks. What you observe is poor throughput at startup that then catches up: the thread pool initially has too few threads, so more pool threads must be created, and the pool does not create threads as quickly as new Thread does. Once the number of pool threads grows to meet demand, the benefits of polling show up again.

Test 1: the same 500 objects with a 1% chance of receiving data, but handling received data is simulated to take 100 milliseconds. The program is very inefficient when it starts, for about 12 seconds; once the thread count grows to around 54, it stabilizes, keeps up with demand, and throughput is high.

Test 2: the same 500 objects with a 1% chance of receiving data, but handling received data is simulated to take 500 milliseconds. The program is very inefficient when it starts, for about 150 seconds; once the thread count grows to around 97, it stabilizes, keeps up with demand, and throughput is high.

0x08 Conclusion

The first obvious conclusion is that polling consumes far fewer resources than keeping threads, especially memory. Polling is also faster (9,300-9,500 receives per second) than keeping threads (8,600+). So in this concurrency model, polling should be used: it conserves resources and improves throughput.

Including await in the comparison is, frankly, unfair: await was never designed for this scenario. Setting await aside, the thread-pool-based approaches differ very little, whether in earlier async tests or in this concurrency test. So reaching for ThreadPool without thinking too hard is always defensible. That said, several types wrap ThreadPool to better fit particular scenarios, which is why Task, await, Parallel, and so on exist. In this test, Parallel is clearly the best fit: it matches direct use of ThreadPool in resource cost and execution efficiency, with less code.

As the supplementary tests show, the operating environment has a significant impact on efficiency, so it's worth running targeted tests in your own environment to pick the most suitable approach. In my environment, for example, the server-side work (forwarding TCP messages and processing a few commands) takes very little time. And even assuming a peak of 500 simultaneous online users, those 500 users will not all log in at the same moment, so the pool's initial shortage of threads never becomes serious: as users log in gradually, the pool's threads grow gradually with demand, which makes the cost of creating them barely noticeable. Polling is therefore clearly appropriate for my environment, and that is also why ReceiveData() in the test only sets a receive probability and adds no simulated delay. If needed, take the test program and adjust the maximum concurrency, the receive probability, and the time each receive takes to match your actual situation.

0x09 Related downloads

Test code Download Link: https://github.com/durow/TestArea/tree/master/AsyncTest/ConcurrenceTest

Category: C# Tags: C#, concurrency
