How to determine the thread pool size for Web apps

When deploying a Web application to production, or when running performance tests against it, a common question is: how should the thread pool size be determined? Determining the thread pool size of an IO-blocking Web application is a daunting task, usually done through extensive performance testing, and having multiple thread pools in a single Web application further complicates finding the optimal size. This article discusses this common problem and offers some suggestions.
The thread pool size of a Web application determines how many requests can be processed concurrently at any given time. If the application receives more requests than the thread pool can handle, the extra requests are queued or rejected.
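As an illustration of this queue-or-reject behaviour, the sketch below uses Java's ThreadPoolExecutor with hypothetical pool and queue sizes: requests beyond the pool size wait in a bounded queue, and once the queue is also full further submissions are rejected.

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.ThreadPoolExecutor;
    import java.util.concurrent.TimeUnit;

    public class RequestPool {
        public static ThreadPoolExecutor create() {
            int poolSize = 20;    // hypothetical: max requests processed concurrently
            int queueSize = 100;  // hypothetical: extra requests allowed to wait

            // Requests beyond poolSize wait in the bounded queue; once the queue is
            // also full, AbortPolicy rejects the request with RejectedExecutionException.
            return new ThreadPoolExecutor(
                    poolSize, poolSize,
                    0L, TimeUnit.MILLISECONDS,
                    new ArrayBlockingQueue<>(queueSize),
                    new ThreadPoolExecutor.AbortPolicy());
        }
    }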
Note that concurrency and parallelism are not the same concept. Concurrent requests are the requests being processed at a point in time, of which only a fraction may actually be executing on the CPU at that moment. Parallel requests are requests that are all executing on the CPU at the same point in time.
In non-blocking IO applications, such as NodeJS, a single thread (or process) can handle multiple requests at the same time; on a multi-core CPU, parallel requests can be served by increasing the number of threads or processes.
In a blocking IO application, such as Spring MVC, a single thread can process only one request at a time. To handle multiple concurrent requests, we must increase the number of threads.
Compute-intensive applications
In compute-intensive applications, the thread pool size should equal the number of CPU cores on the host. Adding more threads only interrupts request processing, because the extra thread context switches also delay response time.
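A minimal sketch of this rule, assuming a plain Java executor: size a fixed pool to the core count reported by the JVM.

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class CpuBoundPool {
        public static ExecutorService create() {
            // For CPU-bound tasks, threads beyond the core count only add
            // context-switch overhead, so match the pool size to the cores.
            int cores = Runtime.getRuntime().availableProcessors();
            return Executors.newFixedThreadPool(cores);
        }
    }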
Non-blocking IO applications tend to be CPU-intensive, because their threads spend no time waiting while requests are processed.
IO-waiting applications
Determining the thread pool size of an IO-waiting application is more complex, because it depends on the response time of the downstream systems: a thread stays blocked until the downstream system responds. We have to increase the number of threads to improve CPU utilization, as discussed in "Reactor Pattern: Applications with Blocking I/O".
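A common rule of thumb for this case (it comes from the book Java Concurrency in Practice, not from this article) is threads ≈ cores × (1 + wait time / compute time), assuming the goal is to keep every core busy. A sketch with illustrative numbers:

    public class IoWaitPoolSize {
        /**
         * Rule-of-thumb sizing for IO-waiting workloads:
         * threads = cores * (1 + waitTime / computeTime).
         * Illustrative numbers: 8 cores, 90 ms waiting on downstream systems
         * and 10 ms of CPU work per request -> 8 * (1 + 9) = 80 threads.
         */
        static int poolSize(int cores, double waitMillis, double computeMillis) {
            return (int) Math.ceil(cores * (1 + waitMillis / computeMillis));
        }

        public static void main(String[] args) {
            System.out.println(poolSize(8, 90, 10)); // prints 80
        }
    }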
Little's Law
Little's Law also applies in non-technical areas; for example, it can estimate how many teller counters a bank needs to serve incoming customers.
Little's Law: in a stable system, the long-term average number of customers in the system, L, equals the long-term effective arrival rate, λ, multiplied by the average time a customer spends in the system, W: L = λW.
Little's Law for Web applications: the average number of threads in the system (Threads) equals the arrival rate of Web requests (WebRequests per sec) multiplied by the average response time (ResponseTime) of each request.
Threads = number of threads
WebRequests per sec = number of Web requests that can be processed in one second
ResponseTime = time required to process one Web request
Threads = (WebRequests per sec) × ResponseTime
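As a worked example with made-up numbers: an application receiving 100 requests per second with an average response time of 0.5 seconds needs, on average, 100 × 0.5 = 50 threads in flight. A minimal sketch in Java:

    public class LittlesLaw {
        // Threads = (WebRequests per sec) * ResponseTime (in seconds).
        static int threadsNeeded(double requestsPerSecond, double avgResponseSeconds) {
            return (int) Math.ceil(requestsPerSecond * avgResponseSeconds);
        }

        public static void main(String[] args) {
            // Illustrative numbers: 100 req/s at 0.5 s average response time.
            System.out.println(threadsNeeded(100, 0.5)); // prints 50
        }
    }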
Although the above formula gives the number of threads needed to handle incoming requests, it says nothing about the ratio of threads to CPU cores, for example how many threads should be allocated on a host with a given number of CPUs.
Determining the thread pool size by testing
To find the right thread pool size, you need to trade off throughput against response time. Start the test at the minimum of one thread per CPU (that is, thread pool size = number of CPU cores), then increase the pool size in proportion to the average response time of the downstream systems, until CPU usage saturates or the response time begins to degrade.
The graphs below show the correlation between the number of requests, CPU utilization, and response time:
    • CPU vs. number of requests: CPU utilization as the load on the Web application increases;
    • Response time vs. number of requests: the impact of increasing load on response time;
    • The green point marks the optimal combination of throughput and response time.
Thread pool size = number of CPUs


This graph describes what happens in an IO-waiting application when the thread pool size equals the number of CPUs. The application's threads block while waiting for downstream systems to respond, so response time stretches as requests pile up in the waiting queue. Because all threads are blocked, the application starts rejecting requests even though CPU usage is still low.
Thread pool size too large


This graph describes an IO-waiting application that creates far too many threads. With so many threads, context switches between them become frequent, so the application's CPU usage is high even though throughput has not improved. Response time is stretched because request processing is repeatedly interrupted by thread context switches.
Optimal thread pool size


This graph describes an IO-waiting application with a reasonable number of threads. The CPU is used effectively, throughput is good, and there is little thread context switching. With fewer interruptions (context switches), request processing is more efficient and the application maintains a good response time.
Thread pool isolation
In most Web applications, only a few types of Web requests take a long time to process. Those slow requests can tie up all the threads and drag down the performance of the entire application.
Two approaches address this problem:
    • Set up a separate host to handle slow Web requests;
    • Allocate a separate thread pool for slow Web requests within the same application, as in the sketch below.
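A minimal sketch of the second approach, with hypothetical pool sizes and a hypothetical dispatch helper: slow requests get their own small pool, so even if all of its threads block, the main pool keeps serving fast requests.

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class IsolatedPools {
        // Hypothetical sizes: the main pool handles the bulk of the traffic,
        // the slow pool is deliberately small so slow endpoints cannot starve it.
        private final ExecutorService fastPool = Executors.newFixedThreadPool(50);
        private final ExecutorService slowPool = Executors.newFixedThreadPool(5);

        void dispatch(Runnable request, boolean isSlowEndpoint) {
            // Slow endpoints (for example, report generation) run on their own pool;
            // fast requests are never queued behind them.
            (isSlowEndpoint ? slowPool : fastPool).submit(request);
        }
    }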
To summarize: determining the thread pool size of an IO-blocking Web application is a daunting task, usually done through extensive performance testing, and having multiple thread pools in a single Web application further complicates finding the optimal size.
Original link: http://venkateshcm.com/2014/05/How-To-Determine-Web-Applications-Thread-Poll-Size/.
