Java Theory and Practice: Thread Pools and Work Queues

Source: Internet
Author: User

Why use a thread pool?

Many server applications, such as Web servers, database servers, file servers, or mail servers, are geared toward handling a large number of short tasks that arrive from some remote source. Requests arrive at the server in some way, perhaps through a network protocol (such as HTTP, FTP, or POP), through a JMS queue, or possibly by polling a database. Regardless of how the requests arrive, the common situation in server applications is that each individual task has a very short processing time while the number of requests is huge.

A simplistic model for building a server application would be to create a new thread whenever a request arrives and service the request in that new thread. This approach actually works fine for prototyping, but if you try to deploy a server application that runs this way, its serious shortcomings become obvious. One drawback of the thread-per-request approach is that creating a new thread for each request is expensive; a server that creates a new thread for every request can spend more time and consume more system resources creating and destroying threads than it does processing actual user requests.
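A minimal sketch of this thread-per-request model might look like the following; the port number and the handleRequest method are hypothetical placeholders, not anything defined in the article.

import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

public class ThreadPerRequestServer {
    public static void main(String[] args) throws IOException {
        ServerSocket serverSocket = new ServerSocket(8080); // hypothetical port
        while (true) {
            Socket client = serverSocket.accept();
            // A brand-new thread is created (and later destroyed) for every request.
            new Thread(() -> handleRequest(client)).start();
        }
    }

    private static void handleRequest(Socket client) {
        // Placeholder for the short-lived request handling described in the text.
        try {
            client.close();
        } catch (IOException ignored) {
        }
    }
}

Every request pays the full cost of thread creation and teardown, which is exactly the overhead the article discusses next.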

In addition to the overhead of creating and destroying threads, active threads themselves consume system resources. Creating too many threads in one JVM can cause the system to run out of memory or thrash due to excessive memory consumption. To prevent resource exhaustion, a server application needs some means of limiting the number of requests being processed at any given time.

A thread pool offers a solution to both the thread lifecycle overhead problem and the resource exhaustion problem. By reusing threads for multiple tasks, the cost of thread creation is spread across many tasks. As a bonus, because a thread already exists when a request arrives, the delay introduced by thread creation is eliminated, so the request can be serviced immediately and the application responds faster. Furthermore, by properly sizing the thread pool you can prevent resource exhaustion: when the number of requests exceeds a threshold, any additional requests are forced to wait until a thread becomes available to handle them.
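The article goes on to hand-roll its own pool; on a current JVM the same bounded, thread-reusing behavior is available from the standard java.util.concurrent package, as in this sketch (the pool size of 4 and the 100 tasks are arbitrary illustration values).

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class FixedPoolExample {
    public static void main(String[] args) {
        // A fixed-size pool: at most 4 threads ever exist, and they are reused across tasks.
        ExecutorService pool = Executors.newFixedThreadPool(4);

        for (int i = 0; i < 100; i++) {
            final int taskId = i;
            // Tasks beyond the pool's capacity wait in the executor's internal queue
            // instead of forcing the creation of new threads.
            pool.execute(() -> System.out.println(
                    "Task " + taskId + " run by " + Thread.currentThread().getName()));
        }
        pool.shutdown();
    }
}

Running it shows the same four thread names repeated across all one hundred tasks, which is the reuse the article describes.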

Alternatives to thread pools

The thread pool is far from the only way to use multithreading within a server application. As mentioned above, it is sometimes perfectly sensible to spawn a new thread for each new task. However, if tasks are created very frequently and their average processing time is short, spawning a new thread for each task can cause performance problems.

Another common threading model is to dedicate a single background thread and task queue to a particular class of tasks. AWT and Swing use this model: there is a single GUI event thread, and all work that changes the user interface must execute in that thread. However, because there is only one AWT thread, it is undesirable to perform tasks on it that may take a perceptible amount of time to complete, since the UI cannot respond in the meantime. As a result, Swing applications often need additional worker threads for long-running, UI-related tasks.
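A minimal sketch of that division of labor follows; loadData() is a hypothetical stand-in for the long-running work, while SwingUtilities.invokeLater is the standard way to hand work back to the event thread.

import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.SwingUtilities;

public class SwingWorkerThreadExample {
    public static void main(String[] args) {
        JLabel status = new JLabel("Loading...");

        // All UI updates must happen on the single AWT/Swing event thread.
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Demo");
            frame.add(status);
            frame.setSize(200, 100);
            frame.setVisible(true);
        });

        // Long-running work runs on a separate worker thread so the event thread stays responsive.
        new Thread(() -> {
            String result = loadData(); // hypothetical slow operation
            // Hand the result back to the event thread for the UI update.
            SwingUtilities.invokeLater(() -> status.setText(result));
        }).start();
    }

    private static String loadData() {
        try {
            Thread.sleep(2000); // simulate slow I/O
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return "Done";
    }
}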

Both the thread-per-task approach and the single-background-thread approach work well in certain situations. The thread-per-task approach works quite well with a small number of long-running tasks. The single-background-thread approach works quite well as long as scheduling predictability is not important, as is the case with low-priority background tasks. However, most server applications are geared toward handling a large number of short-lived tasks or subtasks, and therefore want a mechanism for handling these tasks efficiently and with low overhead, along with some measure of resource management and timing predictability. A thread pool offers these advantages.

Work queues

As far as the actual implementation of a thread pool is concerned, the term "thread pool" is somewhat misleading, because the "obvious" implementation of a thread pool does not, in most cases, produce the results we want. The term "thread pool" predates the Java platform, so it is probably a product of a less object-oriented approach. Nevertheless, the term continues to be widely used.

Although we could easily implement a thread pool class in which a client class waits for an available thread, passes a task to that thread for execution, and returns the thread to the pool when the task completes, this approach has several potentially undesirable consequences. For example, what happens when the pool is empty? A caller that tries to hand a task to a pool thread would find the pool empty, and its own thread would block while waiting for an available pool thread. Yet one of the main reasons we use background threads in the first place is to keep the submitting thread from blocking. An "obvious" thread pool implementation that completely blocks the caller would therefore defeat the very problem we are trying to solve.

What we usually want instead is a work queue combined with a fixed group of worker threads, using wait() and notify() to signal the waiting threads that new work has arrived. The work queue is typically implemented as some kind of linked list with an associated monitor object. Listing 1 shows an example of such a simple pooled work queue. Although the Thread API imposes no special requirement on the use of the Runnable interface, this pattern of queuing Runnable objects is a common convention for schedulers and work queues.
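Listing 1 itself is not reproduced in this copy of the article; the following is a minimal sketch of what such a pooled work queue might look like, assuming a hypothetical WorkQueue class with a PoolWorker inner class, a fixed number of worker threads, and wait()/notify() coordination on the queue's monitor.

import java.util.LinkedList;

public class WorkQueue {
    private final PoolWorker[] threads;
    private final LinkedList<Runnable> queue = new LinkedList<>();

    public WorkQueue(int nThreads) {
        // A fixed set of worker threads is started once and reused for every task.
        threads = new PoolWorker[nThreads];
        for (int i = 0; i < nThreads; i++) {
            threads[i] = new PoolWorker();
            threads[i].start();
        }
    }

    // Callers hand off work without blocking; the task waits in the queue
    // until a worker thread is free to run it.
    public void execute(Runnable r) {
        synchronized (queue) {
            queue.addLast(r);
            queue.notify(); // wake one waiting worker
        }
    }

    private class PoolWorker extends Thread {
        public void run() {
            while (true) {
                Runnable r;
                synchronized (queue) {
                    while (queue.isEmpty()) {
                        try {
                            queue.wait(); // sleep until execute() adds work
                        } catch (InterruptedException ignored) {
                        }
                    }
                    r = queue.removeFirst();
                }
                // Catch RuntimeException so a misbehaving task cannot kill the worker
                // thread and silently shrink the pool.
                try {
                    r.run();
                } catch (RuntimeException e) {
                    // A real implementation would log the failure here.
                }
            }
        }
    }
}

A caller would simply construct the queue once, for example new WorkQueue(4), and then call execute() with Runnable tasks as requests arrive.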
