Thread pools and work queues

Original article: http://www-128.ibm.com/developerworks/cn/java/j-jtp0730/

Thread pools help achieve optimal resource utilization.

Brian Goetz
Chief Consultant, Quiotix Corp
October 12, 2002

One of the most common questions posted on our multithreaded Java programming forum is "How do I create a thread pool?" Thread pools and work queues come up in nearly every server application. In this article, Brian Goetz discusses the motivation for thread pools, some basic implementation and tuning techniques, and some common hazards to avoid.
Why use a thread pool?
Many server applications, such as Web servers, database servers, file servers, or mail servers, are oriented around processing a large number of short tasks that arrive from some remote source. A request arrives at the server in some manner, which might be through a network protocol (such as HTTP, FTP, or POP), through a JMS queue, or perhaps by polling a database. Regardless of how the request arrives, the situation in server applications is often that each individual task takes only a short time to process, but the number of requests is huge.

An overly simplistic model for building a server application would be to create a new thread whenever a request arrives and service the request in that new thread. This approach actually works fine for prototyping, but if you tried to deploy a server application that ran this way, its serious shortcomings would quickly become apparent. One of the shortcomings of the thread-per-request approach is that the overhead of creating a new thread for each request is significant; a server that created a new thread for every request would spend more time and consume more system resources creating and destroying threads than it would processing actual user requests.
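
To make that cost concrete, here is a minimal sketch of the thread-per-request model described above. It is illustrative only; the port number and the handleRequest() method are hypothetical placeholders, not part of the original article.

import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

public class ThreadPerRequestServer {
    public static void main(String[] args) throws IOException {
        ServerSocket serverSocket = new ServerSocket(8080);
        while (true) {
            final Socket socket = serverSocket.accept();
            // A brand-new thread is created (and later destroyed) for every request
            new Thread(new Runnable() {
                public void run() {
                    handleRequest(socket);
                }
            }).start();
        }
    }

    private static void handleRequest(Socket socket) {
        // Hypothetical request handling; just close the connection when done
        try { socket.close(); } catch (IOException ignored) { }
    }
}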

In addition to the overhead of creating and destroying threads, active threads also consume system resources. Creating too many threads in one JVM can cause the system to run out of memory or thrash due to excessive memory consumption. To prevent resource exhaustion, server applications need some means of limiting how many requests are being processed at any given time.

A thread pool offers a solution to both the thread life-cycle overhead problem and the resource-exhaustion problem. By reusing threads across multiple tasks, the thread-creation overhead is spread over many tasks. As a bonus, because the thread already exists when a request arrives, the delay introduced by thread creation is eliminated, so the request can be serviced immediately, making the application more responsive. Furthermore, by properly tuning the number of threads in the pool, you can prevent resource exhaustion by forcing any requests in excess of a certain threshold to wait until a thread becomes available to process them.
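
As an aside, the same bounding idea is available today in the standard library. The sketch below uses java.util.concurrent (added to the platform after this article was written) to cap concurrency at a fixed number of threads; the pool size and task body are illustrative assumptions only.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class FixedPoolExample {
    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(4); // at most 4 worker threads
        for (int i = 0; i < 100; i++) {
            final int taskId = i;
            // Tasks beyond the pool's capacity wait in the executor's internal queue
            pool.execute(new Runnable() {
                public void run() {
                    System.out.println("Processing task " + taskId
                            + " on " + Thread.currentThread().getName());
                }
            });
        }
        pool.shutdown(); // stop accepting new tasks; queued tasks still complete
    }
}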

Alternatives to thread pools
Thread pools are far from the only way to use multiple threads within a server application. As mentioned above, it is sometimes perfectly sensible to spawn a new thread for each new task. However, if tasks are created very frequently and their mean processing time is short, spawning a new thread per task will lead to performance problems.

Another common threading model is to have a single background thread and task queue for tasks of a certain type. AWT and Swing use this model: there is a GUI event thread, and all work that causes changes to the user interface must execute in that thread. However, because there is only one AWT thread, tasks that might take a perceptibly long time to complete should not be executed in it. As a result, Swing applications often require additional worker threads for long-running, UI-related tasks.
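
As a rough illustration of this model, the sketch below moves a long-running task off the event thread and publishes the result back with SwingUtilities.invokeLater(); the label and the loadReport() method are hypothetical placeholders, not part of the original article.

import javax.swing.JLabel;
import javax.swing.SwingUtilities;

public class SwingWorkerSketch {
    private final JLabel statusLabel = new JLabel("Loading...");

    public void startLongRunningTask() {
        new Thread(new Runnable() {
            public void run() {
                final String result = loadReport();        // slow work, off the event thread
                SwingUtilities.invokeLater(new Runnable() {
                    public void run() {
                        statusLabel.setText(result);       // UI change on the event thread
                    }
                });
            }
        }).start();
    }

    private String loadReport() {
        return "done"; // placeholder for a slow operation
    }
}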

The thread-per-task and single-background-thread approaches each work quite well in certain situations. The thread-per-task approach works well when there are only a few long-running tasks. The single-background-thread approach works well as long as scheduling predictability is not important, as is the case for low-priority background tasks. However, most server applications are oriented toward processing large numbers of short-lived tasks or subtasks, so what is often wanted is a mechanism for handling these tasks efficiently with low overhead, together with some measure of resource management and timing predictability. Thread pools offer these advantages.

Work queue
In terms of how a thread pool is actually implemented, the term "thread pool" is somewhat misleading, because the "obvious" implementation of a thread pool does not, in most cases, produce the results we want. The term "thread pool" predates the Java platform, so it is probably a product of a less object-oriented approach. Nevertheless, the term continues to be widely used.

Although we could easily implement a thread pool class in which a client class waits for an available thread, hands the task to that thread for execution, and returns the thread to the pool when the task completes, this approach has several potentially undesirable effects. What happens, for example, when the pool is empty? Any caller that tries to hand a task to a pool thread will find the pool empty, and its thread will block while it waits for an available pool thread. One of the reasons we often want to use background threads in the first place is to prevent the submitting thread from blocking. Blocking the caller completely, as in the "obvious" thread pool implementation, can reintroduce the very problem we were trying to solve.

What we usually want is a work queue combined with a fixed group of worker threads, using wait() and notify() to signal waiting threads that new work has arrived. The work queue is generally implemented as some sort of linked list with an associated monitor object. Listing 1 shows an example of a simple pooled work queue. Although the Thread API imposes no special requirement to use the Runnable interface, this pattern of queueing Runnable objects is a common convention between schedulers and work queues.

Listing 1. Work queue with thread pool

import java.util.LinkedList;

public class WorkQueue
{
    private final int nThreads;
    private final PoolWorker[] threads;
    private final LinkedList queue;

    public WorkQueue(int nThreads)
    {
        this.nThreads = nThreads;
        queue = new LinkedList();
        threads = new PoolWorker[nThreads];

        for (int i = 0; i < nThreads; i++) {
            threads[i] = new PoolWorker();
            threads[i].start();
        }
    }

    public void execute(Runnable r) {
        synchronized (queue) {
            queue.addLast(r);
            queue.notify();    // wake one worker waiting for work
        }
    }

    private class PoolWorker extends Thread {
        public void run() {
            Runnable r;

            while (true) {
                synchronized (queue) {
                    while (queue.isEmpty()) {
                        try {
                            queue.wait();    // block until execute() adds a task
                        }
                        catch (InterruptedException ignored) {
                        }
                    }

                    r = (Runnable) queue.removeFirst();
                }

                // If we don't catch RuntimeException,
                // the pool could leak threads
                try {
                    r.run();
                }
                catch (RuntimeException e) {
                    // You might want to log something here
                }
            }
        }
    }
}
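
To show how the class in Listing 1 might be used, here is a small hypothetical driver; the pool size and task body are illustrative only. Note that this simple WorkQueue has no shutdown mechanism, so its worker threads run until the JVM exits.

public class WorkQueueDemo {
    public static void main(String[] args) {
        WorkQueue pool = new WorkQueue(4); // four long-lived worker threads

        for (int i = 0; i < 20; i++) {
            final int taskId = i;
            pool.execute(new Runnable() {
                public void run() {
                    System.out.println("Task " + taskId + " executed by "
                            + Thread.currentThread().getName());
                }
            });
        }
        // No shutdown: the PoolWorker threads keep waiting for more work.
    }
}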
