Java Multithreading Core Knowledge

Source: Internet
Author: User
Tags: thread, class, volatile, wrapper

Transferred from: https://www.oschina.net/question/3756233_2277445

Compared with other Java knowledge points, multithreading has a certain learning threshold and is harder to understand. In everyday work, improper use can lead to data corruption, inefficient execution (the program effectively runs single-threaded), or deadlocks that hang the program, so a solid grasp of multithreading is critical.

Starting from the basic concepts and finishing with the concurrency model, this article walks through threading knowledge step by step.

Sorting Out the Concepts

In this section I'll show you a few of the basic concepts in multi-threading.

Concurrency and Parallelism

Parallelism means two threads are doing work at exactly the same instant.
Concurrency means the work is interleaved: do a little of one thing, then a little of another, under the control of a scheduler. A single-core CPU cannot achieve parallelism (at the microscopic level), only concurrency.


Critical Section

A critical section represents a common resource or piece of shared data that can be used by multiple threads, but only one thread may use it at a time; once the critical-section resource is occupied, any other thread that needs it must wait until the resource is released.
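
As a minimal sketch (the counter class below is hypothetical, not from the original article), a synchronized block marks a critical section: only one thread at a time may execute the guarded code.

// Critical-section sketch: the synchronized block guards the shared counter,
// so only one thread at a time can execute the increment.
public class CriticalSectionDemo {
    private int count = 0;                    // shared data (the critical resource)
    private final Object lock = new Object();

    public void increment() {
        synchronized (lock) {                 // only one thread may be inside at a time
            count++;
        }
    }

    public int getCount() {
        synchronized (lock) {
            return count;
        }
    }

    public static void main(String[] args) throws InterruptedException {
        CriticalSectionDemo demo = new CriticalSectionDemo();
        Runnable task = () -> { for (int i = 0; i < 10_000; i++) demo.increment(); };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start(); t2.start();
        t1.join();  t2.join();
        System.out.println(demo.getCount());  // always 20000, because the section is guarded
    }
}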


Blocking and Non-blocking

Blocking and non-blocking usually describe how multiple threads interact with one another. For example, if one thread occupies a critical-section resource, every other thread that needs that resource must wait at the critical section, and waiting causes those threads to be suspended. This is blocking. If the occupying thread never releases the resource, none of the threads blocked at the critical section can make progress. Blocking means a thread is suspended at the operating-system level; its performance is generally poor, since a scheduling switch takes roughly 80,000 clock cycles. Non-blocking, by contrast, allows multiple threads to enter the critical section at the same time.

Deadlock

Deadlock is short for process deadlock: a situation in which multiple processes (or threads) each wait indefinitely for resources held by the others.
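
A minimal sketch of how a deadlock arises (the class and lock names are hypothetical, not from the original article): two threads acquire the same two locks in opposite order.

// Deadlock sketch: thread-1 holds lockA and waits for lockB, while thread-2 holds
// lockB and waits for lockA; neither can ever proceed.
public class DeadlockDemo {
    private static final Object lockA = new Object();
    private static final Object lockB = new Object();

    public static void main(String[] args) {
        new Thread(() -> {
            synchronized (lockA) {
                sleepQuietly(100);                 // give the other thread time to grab lockB
                synchronized (lockB) {
                    System.out.println("thread-1 acquired both locks");
                }
            }
        }, "thread-1").start();

        new Thread(() -> {
            synchronized (lockB) {
                sleepQuietly(100);
                synchronized (lockA) {
                    System.out.println("thread-2 acquired both locks");
                }
            }
        }, "thread-2").start();
    }

    private static void sleepQuietly(long millis) {
        try { Thread.sleep(millis); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
    }
}

Acquiring the two locks in the same fixed order in both threads removes the circular wait and therefore the deadlock.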


Livelock

Suppose two threads, 1 and 2, each need both resource A and resource B. Thread 1 takes possession of A, and thread 2 takes possession of B. Because each thread needs both resources to do its work, and in order to avoid a deadlock, thread 1 releases its lock on A and thread 2 releases its lock on B. Now A and B are both free, the two threads each grab a lock again at the same moment, and the situation above repeats: this is a livelock. A simple analogy is two people meeting in a corridor: each steps aside to make way, both move in the same direction at the same time, and they keep blocking each other back and forth. If an online application runs into a livelock problem, congratulations on winning the lottery; this kind of problem is quite hard to troubleshoot.
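
A sketch of that corridor situation in code (a hypothetical class, not from the original article): each worker grabs one lock, politely backs off when the other lock is busy, and retries. If both back off and retry in lockstep, neither makes progress even though both stay busy.

import java.util.concurrent.locks.ReentrantLock;

// Livelock sketch: each worker needs both locks; to avoid deadlock it releases its
// own lock whenever the other lock is busy. Retrying in lockstep means no progress.
public class LivelockSketch {
    private static final ReentrantLock lockA = new ReentrantLock();
    private static final ReentrantLock lockB = new ReentrantLock();

    static void work(ReentrantLock first, ReentrantLock second, String name) {
        while (true) {
            first.lock();
            try {
                if (second.tryLock()) {           // try to grab the other resource
                    try {
                        System.out.println(name + " acquired both locks");
                        return;                    // finished
                    } finally {
                        second.unlock();
                    }
                }
            } finally {
                first.unlock();                    // back off so the other thread can proceed
            }
            // If both threads back off and retry at exactly the same pace, this loops forever.
        }
    }

    public static void main(String[] args) {
        new Thread(() -> work(lockA, lockB, "thread-1")).start();
        new Thread(() -> work(lockB, lockA, "thread-2")).start();
    }
}

In practice, scheduling jitter usually breaks the symmetry and the threads eventually finish, which is exactly why livelocks are hard to reproduce and troubleshoot.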

Starvation

Starvation means that one or more threads cannot obtain the resources they need, for a variety of reasons, and are therefore unable to execute.

Thread Life Cycle

A thread's life cycle passes through several states: created, runnable, and non-runnable.

Created State

When a new thread object is created with the new operator, the thread is in the created state.
A thread in the created state is just an empty thread object; the system has not allocated any resources to it.

Runnable State

Executing the thread's start() method allocates the required system resources to the thread, schedules it to run, and invokes the thread body, the run() method, which puts the thread in the runnable state.
This state is called runnable rather than running because the thread may not actually be running at any given moment.

Non-Runnable State

A running thread enters the non-runnable state when one of the following events occurs:

    • The sleep() method is called;
    • The thread calls the wait() method to wait for a specific condition to be satisfied;
    • The thread is blocked on input/output.

It returns to the runnable state when:

    • A sleeping thread's specified sleep time has elapsed;
    • If the thread was waiting for a condition, another object notifies it through the notify() or notifyAll() method;
    • If the thread was blocked on input/output, the input/output completes.
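
A small sketch (hypothetical, not from the original article) that observes these states through Thread.getState():

// Observing thread states: NEW after construction, RUNNABLE after start(),
// TIMED_WAITING while sleeping, TERMINATED after run() finishes.
public class ThreadStateDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread t = new Thread(() -> {
            try {
                Thread.sleep(500);                // enters a non-runnable (TIMED_WAITING) state
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        System.out.println(t.getState());         // NEW: the created state, no resources yet
        t.start();
        System.out.println(t.getState());         // typically RUNNABLE: scheduled, but maybe not running
        Thread.sleep(100);
        System.out.println(t.getState());         // TIMED_WAITING: sleeping, non-runnable
        t.join();
        System.out.println(t.getState());         // TERMINATED: run() has finished
    }
}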

Thread Priority

Thread Priority and Settings

Threads have priorities to make it easier for the system to schedule them in a multithreaded environment: higher-priority threads execute first. Priority settings follow these rules:

    • When a thread is created, the child thread inherits the parent's priority;
    • After the thread is created, the priority can be changed by calling the setPriority() method;
    • A thread's priority is an integer between 1 and 10.
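
A minimal sketch (hypothetical, not from the original article) of reading and setting priorities:

// Thread priorities: the child inherits the parent's priority and can be changed
// with setPriority(); valid values run from MIN_PRIORITY (1) to MAX_PRIORITY (10).
public class PriorityDemo {
    public static void main(String[] args) {
        Thread worker = new Thread(() -> System.out.println(
                Thread.currentThread().getName() + " priority = "
                        + Thread.currentThread().getPriority()));

        System.out.println("inherited priority: " + worker.getPriority()); // same as main's (5 by default)
        worker.setPriority(Thread.MAX_PRIORITY);                           // change it to 10
        worker.start();
    }
}

Note that priority is only a hint to the scheduler; it does not guarantee execution order.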

Thread Scheduling Policy

The thread scheduler chooses the highest-priority thread to run. However, a running thread stops running if any of the following occurs:

    • The yield() method is called in the thread body, giving up the CPU;
    • The sleep() method is called in the thread body, putting the thread to sleep;
    • The thread is blocked by an I/O operation;
    • Another thread with a higher priority appears;
    • In a system that supports time slices, the thread's time slice runs out.

Single-Thread Creation Methods

Creating a single thread is relatively simple. There are generally only two ways: extend the Thread class, or implement the Runnable interface. Demos of these two approaches are common, so they are not repeated here, but beginners should pay attention to the following points (a short sketch follows below):

    • Whether you extend the Thread class or implement the Runnable interface, the business logic is written in the run() method, and the thread is started by calling the start() method;
    • Starting a new thread does not affect the code execution order of the main thread and does not block it;
    • The execution order between the new thread's code and the main thread's code is not guaranteed;
    • In a multithreaded program, from a microscopic point of view only one thread is working at any instant (on a single core); the purpose of multithreading is to keep the CPU busy;
    • Looking at the source code of Thread shows that the Thread class itself implements the Runnable interface, so the two approaches are essentially one.

PS: This code structure is also worth borrowing in everyday work: the upper layer is given more choices for how to call in, while the core business logic is kept in one place for maintenance.
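
A minimal sketch of the two approaches (hypothetical class names, not from the original article):

// The two basic ways to create a thread: extend Thread, or implement Runnable.
// Either way the business logic lives in run() and the thread is started with start().
public class CreateThreadDemo {

    static class MyThread extends Thread {
        @Override
        public void run() {
            System.out.println("running in a Thread subclass: " + Thread.currentThread().getName());
        }
    }

    static class MyTask implements Runnable {
        @Override
        public void run() {
            System.out.println("running a Runnable: " + Thread.currentThread().getName());
        }
    }

    public static void main(String[] args) {
        new MyThread().start();             // approach 1: subclass Thread
        new Thread(new MyTask()).start();   // approach 2: pass a Runnable to Thread
        // The main thread continues immediately; the relative order of the
        // three println outputs is not guaranteed.
        System.out.println("main thread keeps going: " + Thread.currentThread().getName());
    }
}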

Why Use a Thread Pool

With the introduction above you can already develop a multithreaded program, so why introduce the thread pool? Mainly because the plain-thread approach has the following problems:

    • A thread's working cycle: creating the thread takes time T1, executing the task takes time T2, and destroying the thread takes time T3. T1 + T3 is often greater than T2, so creating threads frequently wastes a lot of extra time;
    • Creating a new thread whenever a task arrives is inefficient; if an available thread can be taken directly from a pool, efficiency improves. The thread pool removes the create-a-thread step that would otherwise precede every task, saving time and improving efficiency;
    • A thread pool can manage and control threads. Threads are a scarce resource: creating them without limit not only consumes system resources but also reduces system stability. With a pool, threads can be allocated, tuned, and monitored uniformly;
    • A thread pool provides a queue that buffers the tasks waiting to be executed.

Summing up the reasons above, the conclusion is: in everyday work, when developing a multithreaded program, use a thread pool to create and manage threads whenever possible.

Seen from the calling API, creating threads through a thread pool falls into two categories: the native thread pool, and the simpler factory methods provided by the Java concurrency package. The latter are just a thin wrapper around native thread pool creation that makes life easier for the caller, but underneath they are the same thing, so understanding how the native thread pool works is very important.

ThreadPoolExecutor

To create a thread pool with ThreadPoolExecutor, the constructor is as follows:

public ThreadPoolExecutor(int corePoolSize,
                          int maximumPoolSize,
                          long keepAliveTime,
                          TimeUnit unit,
                          BlockingQueue<Runnable> workQueue,
                          ThreadFactory threadFactory,
                          RejectedExecutionHandler handler);

First, let's go through what each parameter means (if it still feels fuzzy, a general impression is enough; the summary below ties them together).

    • corePoolSize

The size of the core pool.

After the thread pool is created, by default it contains no threads; a thread is created only when a task arrives, unless the prestartAllCoreThreads() or prestartCoreThread() method is called. As the names suggest, these methods pre-create either corePoolSize threads or one thread before any task arrives. By default, then, the pool starts with 0 threads; when a task comes in, a thread is created to execute it, and once the number of threads in the pool reaches corePoolSize, further incoming tasks are placed in the cache queue.

    • maximumPoolSize

The maximum number of threads in the pool. This is also a very important parameter: it represents the maximum number of threads the pool is allowed to create.

    • keepAliveTime

Indicates how long a thread may stay alive with no task to execute. By default, keepAliveTime only takes effect when the number of threads in the pool is greater than corePoolSize: once a thread has been idle for keepAliveTime it is terminated, until the number of threads in the pool no longer exceeds corePoolSize.
However, if the allowCoreThreadTimeOut(boolean) method is called, keepAliveTime also applies when the number of threads is not greater than corePoolSize, until the number of threads in the pool drops to 0.

    • unit

The time unit of the keepAliveTime parameter.

    • workQueue

A blocking queue used to store the tasks waiting to be executed. The choice of this parameter is also important and has a significant impact on how the thread pool runs. In general, the common options are ArrayBlockingQueue, LinkedBlockingQueue, and SynchronousQueue.

    • threadFactory

The thread factory, used to create the pool's threads.

    • handler

The rejection policy applied when a task cannot be handled. It takes one of the following four values:

    • ThreadPoolExecutor.AbortPolicy: discards the task and throws a RejectedExecutionException;
    • ThreadPoolExecutor.DiscardPolicy: also discards the task, but does not throw an exception;
    • ThreadPoolExecutor.DiscardOldestPolicy: discards the task at the head of the queue and then tries to submit the task again (this may repeat);
    • ThreadPoolExecutor.CallerRunsPolicy: the task is handled by the calling thread.

How do the above parameters work together? [Figure: thread pool parameter collaboration, with numbered steps.]

To summarize how the thread pool parameters collaborate, task submission goes through the following steps (a small construction sketch follows this list):

    • A task is first submitted to the core pool (corePoolSize threads);
    • Once the core pool is full, the task is put into the work queue, where it waits for an idle thread;
    • Once the work queue is also full and no core thread is idle, the task is handed to the max pool (up to maximumPoolSize threads); if the max pool is full as well, the rejection policy is executed.
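
A minimal construction sketch (the concrete pool sizes, queue capacity, and rejection policy below are illustrative choices, not taken from the original article):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

// Construction sketch: 2 core threads, at most 4 threads, idle non-core threads die
// after 60 seconds, and up to 8 tasks can wait in the bounded queue. When the queue
// and the max pool are both full, CallerRunsPolicy makes the submitting thread run
// the task itself.
public class ThreadPoolDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2,                                          // corePoolSize
                4,                                          // maximumPoolSize
                60L, TimeUnit.SECONDS,                      // keepAliveTime + unit
                new ArrayBlockingQueue<>(8),                // workQueue
                Thread::new,                                // threadFactory
                new ThreadPoolExecutor.CallerRunsPolicy()); // handler

        for (int i = 0; i < 20; i++) {
            final int taskId = i;
            pool.execute(() -> System.out.println(
                    Thread.currentThread().getName() + " runs task " + taskId));
        }
        pool.shutdown();                                    // stop accepting new tasks, finish queued ones
    }
}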

These are the core principles of native thread pool creation. Besides the native thread pool, the concurrency package also provides simple factory methods for creating pools; as mentioned above, they are wrappers around the native thread pool that let developers create the pools they need quickly and easily.

Executors

newSingleThreadExecutor
Creates a thread pool that always contains exactly one thread. If the thread exits because of an exception, a new thread replaces it. This pool guarantees that tasks are executed in the order in which they were submitted.

newFixedThreadPool
Creates a fixed-size thread pool. Each time a task is submitted, a new thread is created until the pool reaches its maximum size, after which the size stays constant. If a thread ends because of an execution exception, the pool replenishes it with a new one.

newCachedThreadPool
Creates a thread pool that adjusts the number of threads to the actual workload, so the number of threads is not fixed: if an idle thread exists it is reused; otherwise a new thread is created for the submitted task. This pool is not recommended in everyday development because, in extreme cases, it can create so many threads that CPU and memory are exhausted.

newScheduledThreadPool
Creates a thread pool with a fixed number of threads that can execute tasks periodically, for example via scheduleAtFixedRate or scheduleWithFixedDelay, which specify the period.
PS: When writing scheduled tasks (if not using the Quartz framework), it is best to use this pool, because it guarantees that there is always a live thread.
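
A brief sketch of the four factory methods (the pool sizes and tasks below are illustrative, not from the original article):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// The four common Executors factory methods described above.
public class ExecutorsDemo {
    public static void main(String[] args) {
        ExecutorService single = Executors.newSingleThreadExecutor();   // one thread, ordered execution
        ExecutorService fixed  = Executors.newFixedThreadPool(4);       // fixed number of threads
        ExecutorService cached = Executors.newCachedThreadPool();       // grows on demand (use with care)
        ScheduledExecutorService scheduled = Executors.newScheduledThreadPool(1);

        single.execute(() -> System.out.println("single"));
        fixed.execute(() -> System.out.println("fixed"));
        cached.execute(() -> System.out.println("cached"));
        // run a task every 5 seconds, starting after a 1-second delay
        scheduled.scheduleAtFixedRate(
                () -> System.out.println("periodic task"), 1, 5, TimeUnit.SECONDS);

        single.shutdown();
        fixed.shutdown();
        cached.shutdown();
        // 'scheduled' is left running here so the periodic task keeps firing
    }
}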

Recommended Way: Use ThreadPoolExecutor

Alibaba's Java Development Manual recommends not creating thread pools with Executors, but with ThreadPoolExecutor instead.
The main reason is that the Executors factory methods do not require the core parameters to be passed in; they fall back on default values, so developers tend to ignore what those parameters mean, and when the business scenario is demanding there is a risk of resource exhaustion. In addition, creating pools with ThreadPoolExecutor directly forces us to understand the pool's running rules, which is a great benefit both in interviews and for technical growth.
Visibility

Visibility means that when one thread modifies a shared variable, other threads can immediately see the change. There are several ways to ensure visibility:

    • volatile

A variable declared with the volatile keyword is compiled with an extra lock-prefixed instruction, which acts as a memory barrier; the memory barrier guarantees the ordering of memory operations. When a volatile variable is written, the new value is written back to main memory.
Because the processor implements a cache coherence protocol, the write to main memory invalidates the copies cached by other processors; in other words, the other threads' working memory becomes invalid and they must re-read the data from main memory.
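
A minimal visibility sketch (the flag name is hypothetical, not from the original article): with volatile, the writer thread's update is guaranteed to become visible to the reader thread; without it, the reader could spin forever on a stale cached value.

// Visibility via volatile: the main thread's write to 'running' is flushed to main
// memory and the worker thread re-reads it, so the loop terminates.
public class VolatileDemo {
    private static volatile boolean running = true;

    public static void main(String[] args) throws InterruptedException {
        Thread worker = new Thread(() -> {
            while (running) {
                // busy work
            }
            System.out.println("worker saw running == false and stopped");
        });
        worker.start();

        Thread.sleep(1000);
        running = false;          // volatile write: immediately visible to the worker
        worker.join();
    }
}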

Recommended Learning:

http://www.roncoo.com/course/view/b6f89747a8284f44838b2c4da6c8677b
