Better Use of the Java Thread Pool (ThreadPoolExecutor)


This article describes the thread pool's size parameters, the creation of worker threads, the reclamation of idle threads, the use of blocking queues, task rejection policies, and thread pool hooks. A number of details are covered, including how the choice of parameters, queues, and rejection policies affects the pool's behavior, so that you can use the thread pool more effectively.

ExecutorService executes submitted tasks on pooled threads. In most cases, a ThreadPoolExecutor instance can be created simply through the factory methods provided by Executors.

The thread pool solves two problems: 1) it improves performance when executing large numbers of asynchronous tasks by reducing the per-task overhead of thread creation and teardown; 2) it provides a way to bound and manage the resources and threads consumed when batches of tasks are executed. In addition, ThreadPoolExecutor maintains some simple statistics, such as the number of completed tasks.

Quick Start

To make the thread pool suitable for many different application contexts, ThreadPoolExecutor provides many configurable parameters and extensible hooks. However, the factory methods provided by Executors can quickly create a ThreadPoolExecutor instance for common configurations. For example:
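A minimal sketch of the Executors factory methods; the pool sizes here are illustrative:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class FactoryMethodDemo {
    public static void main(String[] args) {
        // Fixed-size pool: corePoolSize == maximumPoolSize, backed by an unbounded queue
        ExecutorService fixed = Executors.newFixedThreadPool(4);

        // Cached pool: core size 0, effectively unbounded maximum, SynchronousQueue handoff
        ExecutorService cached = Executors.newCachedThreadPool();

        // Single worker thread backed by an unbounded queue
        ExecutorService single = Executors.newSingleThreadExecutor();

        fixed.submit(() -> System.out.println("task on " + Thread.currentThread().getName()));

        fixed.shutdown();
        cached.shutdown();
        single.shutdown();
    }
}
```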

If the instances created by these factory methods do not meet our needs, we can construct a ThreadPoolExecutor directly and configure its parameters ourselves.
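A sketch of direct construction; the specific sizes, timeout, and queue capacity are illustrative choices, not recommendations:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class CustomPoolDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor executor = new ThreadPoolExecutor(
                2,                                    // corePoolSize
                4,                                    // maximumPoolSize
                60L, TimeUnit.SECONDS,                // keepAliveTime for threads above core size
                new ArrayBlockingQueue<>(100),        // bounded work queue
                new ThreadPoolExecutor.AbortPolicy()  // rejection policy
        );

        executor.execute(() -> System.out.println("running in " + Thread.currentThread().getName()));
        executor.shutdown();
    }
}
```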

What you need to know about thread counts and parameter settings

ThreadPoolExecutor dynamically adjusts the size of the thread pool according to corePoolSize and maximumPoolSize.

When a task is submitted to the thread pool through execute, we need to be aware of the following points:

Core thread warm-up

By default, core worker threads are not created when the pool is constructed; they are started only when new tasks arrive. However, we can change this behavior by calling the prestartCoreThread or prestartAllCoreThreads method. In general, we can warm up the core threads when the application starts, so that initial tasks are executed immediately and their processing time is reduced.
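A minimal warm-up sketch; pool sizes are illustrative:

```java
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class WarmUpDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor executor = new ThreadPoolExecutor(
                4, 8, 60L, TimeUnit.SECONDS, new LinkedBlockingQueue<>());

        // Start all core threads up front instead of lazily on first submission
        int started = executor.prestartAllCoreThreads();
        System.out.println("pre-started " + started + " core threads");

        executor.shutdown();
    }
}
```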

Create custom worker threads

New threads are created through a ThreadFactory. If none is specified, the default Executors#defaultThreadFactory is used, in which case all created threads belong to the same thread group and have the same priority and daemon status. By supplying our own ThreadFactory, we can configure the thread name, thread group, and daemon status. If ThreadFactory#newThread returns null when asked to create a thread, the executor will continue, but it may not be able to execute any tasks.
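A sketch of a custom ThreadFactory; the class name and "worker" prefix are illustrative:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadFactory;
import java.util.concurrent.atomic.AtomicInteger;

public class NamedThreadFactory implements ThreadFactory {
    private final AtomicInteger counter = new AtomicInteger(1);
    private final String prefix;

    public NamedThreadFactory(String prefix) {
        this.prefix = prefix;
    }

    @Override
    public Thread newThread(Runnable r) {
        Thread t = new Thread(r, prefix + "-" + counter.getAndIncrement());
        t.setDaemon(false);                   // explicit daemon status
        t.setPriority(Thread.NORM_PRIORITY);  // explicit priority
        return t;
    }

    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(2, new NamedThreadFactory("worker"));
        pool.submit(() -> System.out.println(Thread.currentThread().getName()));
        pool.shutdown();
    }
}
```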

Idle thread recycling

If the number of worker threads in the pool exceeds corePoolSize, the threads above that number are terminated once they have been idle for longer than keepAliveTime; this is a strategy for reducing unnecessary resource consumption. The keepAliveTime parameter can be changed at runtime. We can also apply the same policy to core threads by calling allowCoreThreadTimeOut.
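A sketch of adjusting the recycling policy at runtime; the timeouts are illustrative:

```java
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class IdleRecycleDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor executor = new ThreadPoolExecutor(
                2, 4, 30L, TimeUnit.SECONDS, new LinkedBlockingQueue<>());

        // keepAliveTime can be changed while the pool is running
        executor.setKeepAliveTime(10L, TimeUnit.SECONDS);

        // Allow the core threads themselves to time out when idle
        // (requires a non-zero keepAliveTime)
        executor.allowCoreThreadTimeOut(true);

        executor.shutdown();
    }
}
```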

Select an appropriate blocking queue

Any blocking queue can be used to hold tasks. However, different queues interact differently with corePoolSize:

When the number of worker threads in the pool is smaller than corePoolSize, a new worker thread is created for each arriving task.

When the number of worker threads is greater than or equal to corePoolSize, each new task is first offered to the queue rather than triggering the creation of a new thread.

When the task cannot be queued and the number of threads in the pool is smaller than maximumPoolSize, a new worker thread is created; otherwise the task is rejected.

The following describes the performance of different queue policies:

Direct handoff: a good default choice is SynchronousQueue, which hands submitted tasks directly to worker threads without holding them. If no worker thread is available to take a task, the attempt to queue it fails, and the pool creates a new worker thread, so newly submitted tasks are processed promptly. This strategy avoids lock contention when handling batches of submitted tasks that depend on one another. Note that this policy is best combined with an unbounded maximumPoolSize to prevent tasks from being rejected; the flip side is that when tasks arrive faster than they can be processed, the number of threads can grow without bound.

Unbounded queue: when you use an unbounded queue, for example a LinkedBlockingQueue without a specified maximum capacity, new tasks are placed in the queue whenever all core threads are busy. As a result, no more than corePoolSize threads are ever created, and the maximumPoolSize parameter has no effect. This policy is suitable when tasks are independent of one another, for example a web server in which each thread handles requests independently. However, when tasks arrive faster than they are processed, the queue grows without bound.

Bounded queue: bounded queues, such as ArrayBlockingQueue, help limit resource consumption, but are harder to tune: the queue length and maximumPoolSize trade off against each other. Using a large queue with a small maximumPoolSize reduces CPU usage, operating system resources, and context-switching overhead, but lowers throughput; if tasks block frequently, as I/O-bound tasks do, the system could profitably schedule more threads. Using a small queue usually requires a larger maximumPoolSize, which keeps the CPU busier but increases thread-scheduling overhead and can also reduce throughput. To sum up: for I/O-intensive workloads, consider more threads to keep the CPU utilized; for CPU-intensive workloads, consider fewer threads to reduce scheduling overhead.
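The three strategies above can be sketched as pool configurations; all sizes and capacities are illustrative:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.SynchronousQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class QueueChoiceDemo {
    public static void main(String[] args) {
        // Direct handoff: SynchronousQueue holds no tasks; pair it with a large maximumPoolSize
        ThreadPoolExecutor handoff = new ThreadPoolExecutor(
                0, Integer.MAX_VALUE, 60L, TimeUnit.SECONDS, new SynchronousQueue<>());

        // Unbounded queue: the pool never grows past corePoolSize, maximumPoolSize is ignored
        ThreadPoolExecutor unbounded = new ThreadPoolExecutor(
                4, 8, 0L, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<>());

        // Bounded queue: caps memory use; queue length and maximumPoolSize trade off
        ThreadPoolExecutor bounded = new ThreadPoolExecutor(
                4, 8, 60L, TimeUnit.SECONDS, new ArrayBlockingQueue<>(100));

        handoff.shutdown();
        unbounded.shutdown();
        bounded.shutdown();
    }
}
```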

Select a suitable rejection policy

When a new task arrives and the thread pool has been shut down, or the thread count and the queue have both reached their limits, we must decide how to reject the task. Common policies are described below:

ThreadPoolExecutor.AbortPolicy: the default policy; it throws a RejectedExecutionException.

ThreadPoolExecutor.CallerRunsPolicy: runs the task on the calling thread. This provides a simple feedback mechanism that slows down the task submission rate.

ThreadPoolExecutor.DiscardPolicy: silently discards the task.

ThreadPoolExecutor.DiscardOldestPolicy: discards the task at the head of the queue and retries the submission; if that fails again, the process repeats.

In addition to the above policies, we can implement our own by implementing the RejectedExecutionHandler interface.
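A sketch of a custom handler; the class name and log-and-drop behavior are illustrative (a real handler might block, requeue elsewhere, or raise an alert):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.RejectedExecutionHandler;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class LoggingRejectionHandler implements RejectedExecutionHandler {
    @Override
    public void rejectedExecution(Runnable r, ThreadPoolExecutor executor) {
        // Log and drop; called by the pool whenever a task cannot be accepted
        System.err.println("Rejected task " + r + "; pool size = " + executor.getPoolSize());
    }

    public static void main(String[] args) {
        ThreadPoolExecutor executor = new ThreadPoolExecutor(
                1, 1, 0L, TimeUnit.MILLISECONDS,
                new ArrayBlockingQueue<>(1),
                new LoggingRejectionHandler());
        // Tasks beyond one running plus one queued are handed to the handler
        executor.shutdown();
    }
}
```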

Use hooks to embed your own behavior

ThreadPoolExecutor provides protected methods that can be overridden to run logic around task execution. We can use them to initialize ThreadLocals, collect statistics, record logs, and so on. These hooks are beforeExecute and afterExecute. A third hook, terminated, can be used to insert logic that runs once the executor has fully terminated.

If a hook method throws an exception, the internal worker thread may in turn fail and terminate abruptly.
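A sketch of a monitoring subclass using all three hooks; the class name and what it logs are illustrative:

```java
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

public class MonitoredPool extends ThreadPoolExecutor {
    private final AtomicLong completed = new AtomicLong();

    public MonitoredPool(int core, int max) {
        super(core, max, 60L, TimeUnit.SECONDS, new LinkedBlockingQueue<>());
    }

    @Override
    protected void beforeExecute(Thread t, Runnable r) {
        super.beforeExecute(t, r);
        System.out.println(t.getName() + " starting " + r);
    }

    @Override
    protected void afterExecute(Runnable r, Throwable t) {
        super.afterExecute(r, t);
        completed.incrementAndGet();
        if (t != null) {
            System.err.println("task failed: " + t);
        }
    }

    @Override
    protected void terminated() {
        super.terminated();
        System.out.println("pool terminated after " + completed.get() + " tasks");
    }
}
```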

Accessing the queue

The getQueue method provides access to the work queue for statistics or debugging; using it for any other purpose is discouraged. The remove and purge methods can also be used to delete tasks from the queue.
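A small sketch of queue inspection and removal; the single-thread pool and sleep duration are illustrative:

```java
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class QueueInspectionDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor executor = new ThreadPoolExecutor(
                1, 1, 0L, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<>());

        Runnable blocker = () -> {
            try { Thread.sleep(1000); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        };
        Runnable queued = () -> System.out.println("queued task");

        executor.execute(blocker); // occupies the single worker thread
        executor.execute(queued);  // waits in the queue

        System.out.println("queued tasks: " + executor.getQueue().size()); // statistics/debugging only
        executor.remove(queued);   // remove a task that has not started
        executor.purge();          // drop cancelled Future tasks from the queue

        executor.shutdown();
    }
}
```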

Shutting down the thread pool

When the thread pool is no longer referenced and has no remaining worker threads, it is terminated automatically. We can also call shutdown to terminate the pool manually. If we forget to call shutdown, we can combine keepAliveTime with allowCoreThreadTimeOut so that idle threads die off and their resources are eventually released.
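A sketch of the usual orderly-shutdown pattern; the timeout value is illustrative:

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ShutdownDemo {
    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        pool.submit(() -> System.out.println("work"));

        pool.shutdown();                                   // stop accepting new tasks
        if (!pool.awaitTermination(5, TimeUnit.SECONDS)) { // wait for submitted tasks to finish
            List<Runnable> dropped = pool.shutdownNow();   // interrupt workers, drain the queue
            System.out.println("dropped " + dropped.size() + " queued tasks");
        }
    }
}
```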

Conclusion

The APIs provided by Java allow us to quickly develop multi-threaded programs based on the thread pool, but we must be responsible for the code we write: each parameter setting and policy choice depends heavily on the application scenario, and selecting them well is not easy. We must first answer some basic questions: what does the operating system do for every thread we create, and what are that thread's main resource costs? If my application scenario is I/O-intensive, do I need more threads or fewer? What should we choose if the workload is roughly half CPU and half I/O? And so on. In my opinion, multi-threaded development is by no means an easy task.
