The life cycle and transformation of threads


A thread is a single sequential flow of control within a process, and is also known as a lightweight process. In computer science terminology, a thread is the unit of scheduling in a running program.

A thread is an entity within a process: a process can have multiple threads, and every thread must have a parent process. A thread does not own system resources of its own, only the data structures it needs in order to run; it shares all the resources owned by the parent process with that process's other threads. Threads can create and destroy other threads, which enables concurrent execution within a program. In general, a thread has three basic states: ready, blocked, and running.

In a multi-CPU system, different threads can run simultaneously on different CPUs, even when they belong to the same process. Most operating systems that support multiprocessors provide a programming interface that lets a process control the affinity between its threads and particular processors.

Sometimes threads are also referred to as lightweight processes. Like processes, threads are independent, concurrent paths of execution through a program, and each thread has its own stack, its own program counter, and its own local variables. However, threads within a process are less isolated from one another than separate processes are: they share memory, file handles, and other per-process state.

A process can support multiple threads that appear to execute simultaneously but are not synchronized with one another. The threads of a process share the same memory address space, which means they can access the same variables and objects and allocate objects from the same heap. Although this makes it easy for threads to share information, you must take care that they do not interfere with other threads in the same process.
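
As a minimal sketch of that interference (the class name and loop counts here are illustrative, not from the original article), two threads that increment the same unsynchronized field will usually lose updates:

public class SharedCounter {
    static int count = 0;                       // shared by both threads

    public static void main(String[] args) throws InterruptedException {
        Runnable task = new Runnable() {
            public void run() {
                for (int i = 0; i < 100000; i++) {
                    count++;                    // not atomic: read, add, write back
                }
            }
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        // Usually prints less than 200000 because the unsynchronized increments interleave.
        System.out.println("count = " + count);
    }
}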

The Java threading tools and APIs look simple. However, writing complex programs that use threads effectively is not easy, because multiple threads live in the same memory space and share the same variables, so you must take care that your threads do not interfere with one another.

I. The concept of a thread

Generally speaking, a program that is executing on a computer is called a process rather than a program. A thread is a single sequential flow of control within such a process.

Newer operating systems, such as the Mac OS, Windows NT, and Windows 95, mostly use the concept of multithreading and treat the thread as the basic unit of execution. Threads are also one of the most important features of Java.

Even the simplest applet is carried out by multiple threads. In Java, an applet's paint() and update() methods are called by the AWT (Abstract Window Toolkit) drawing and event-handling thread, while the applet's major milestone methods, init(), start(), stop(), and destroy(), are called by the application that executes the applet.

There is nothing new in the concept of a single thread; what is really interesting is using multiple threads in one program to accomplish different tasks. In some places the term lightweight process is used instead of thread; the similarity between threads and true processes is that both are single sequential flows of control. A thread is considered lightweight, however, because it runs within the context of a whole program and can use the resources and environment common to that program.

As a single sequential flow of control, a thread needs certain resources as overhead in order to run; for example, it must have an execution stack and a program counter. Code executed within a thread works only in that context, so some texts use "execution context" instead of "thread".

II. Thread properties

To use threads correctly and effectively, you must understand the various aspects of a thread and the Java runtime system. You must know how to provide the thread body, what the thread's lifecycle is, how the runtime system schedules threads, what thread groups are, and what daemon threads are.

(1) Thread body
All of the action takes place in the thread body. In Java, the thread body is the run() method inherited from the Thread class, or the run() method of a class that implements the Runnable interface. After a thread has been created and initialized, the runtime system calls its run() method. The code in the run() method implements the behavior of the thread and is the main part of the thread.
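
A minimal sketch of the two ways to supply a thread body described above (class names are illustrative):

// Way 1: subclass Thread and override run()
class Worker extends Thread {
    public void run() {
        System.out.println("running in " + getName());
    }
}

// Way 2: implement Runnable and pass it to a Thread
class Job implements Runnable {
    public void run() {
        System.out.println("running in " + Thread.currentThread().getName());
    }
}

public class ThreadBodyDemo {
    public static void main(String[] args) {
        new Worker().start();           // the runtime calls Worker.run()
        new Thread(new Job()).start();  // the runtime calls Job.run()
    }
}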

(2) Thread state
The figure represents the states a thread can be in at any point in its lifecycle, together with the methods that cause transitions between states. It is not a complete finite-state diagram, but it outlines the more interesting and common aspects of a thread. The following discussion covers the thread lifecycle.


New thread state (New Thread)
Creating a Thread object produces a thread in the new state. While a thread is in the "new thread" state, it is only an empty Thread object; no system resources have been allocated for it yet. It can therefore only be started or stopped; calling any other method throws an exception.

Runnable state (Runnable)
The start() method creates the resources the thread needs to run, schedules the thread for execution, and calls the thread's run() method. At this point the thread is in the runnable state. The state is called "runnable" rather than "running" because the thread does not always hold the processor. In particular, on a PC with a single processor only one thread can occupy the processor at any moment; Java shares the processor among multiple threads through scheduling.

Non-runnable state (Not Runnable)
A thread enters the non-runnable state when one of the following events occurs:
① the suspend() method is called;
② the sleep() method is called;
③ the thread waits on a condition variable by calling wait();
④ the thread is waiting for I/O.

Dead state (Dead)
A thread enters the dead state when its run() method returns or when another thread calls its stop() method. Typically, an applet uses its stop() method to terminate all the threads it has produced.
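
A small sketch of these transitions, assuming the Thread.getState() method added in later versions of Java is available; the sleep durations are arbitrary and the commented values show typical output:

public class LifecycleDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread t = new Thread(new Runnable() {
            public void run() {
                try {
                    Thread.sleep(200);          // non-runnable while sleeping
                } catch (InterruptedException e) {
                }
            }
        });
        System.out.println(t.getState());       // NEW: created but not started
        t.start();
        System.out.println(t.getState());       // RUNNABLE (typically)
        Thread.sleep(50);
        System.out.println(t.getState());       // TIMED_WAITING: blocked in sleep()
        t.join();                               // wait for run() to return
        System.out.println(t.getState());       // TERMINATED: the thread is dead
    }
}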

(3) Thread priority
Although we say that threads run concurrently, this is often not literally true. As mentioned earlier, when the system has only one CPU, running multiple threads on that CPU in some order is called scheduling. Java uses a simple, fixed scheduling policy: fixed-priority scheduling. This algorithm schedules threads according to the relative priorities of the runnable threads. When a thread is created, it inherits the priority of the thread that created it; the priority can be changed later as needed. At any moment, if several threads are waiting to run, the system picks the runnable thread with the highest priority. A lower-priority thread can run only when the higher-priority thread stops, yields, or becomes non-runnable for some reason. If two threads have the same priority, they run alternately.

The thread scheduling algorithm of the Java runtime system is preemptive: if at any moment a thread with a higher priority than all other runnable threads becomes runnable, the runtime system selects that thread to run.
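
A short sketch of adjusting priorities with setPriority(); the actual interleaving depends entirely on the platform scheduler, so the output order is not guaranteed:

public class PriorityDemo {
    public static void main(String[] args) {
        Runnable task = new Runnable() {
            public void run() {
                for (int i = 0; i < 5; i++) {
                    System.out.println(Thread.currentThread().getName() + " " + i);
                    Thread.yield();              // voluntarily give up the processor
                }
            }
        };
        Thread low = new Thread(task, "low");
        Thread high = new Thread(task, "high");
        low.setPriority(Thread.MIN_PRIORITY);    // 1
        high.setPriority(Thread.MAX_PRIORITY);   // 10
        low.start();
        high.start();
    }
}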

(4) Daemon threads
Any Java thread can be a daemon thread. A daemon thread is a service provider for other objects and threads within the same process. For example, the HotJava browser has a daemon thread called the "background image reader" that reads images from the file system or the network for the objects and threads that need them.

A daemon thread is a typical standalone thread in an application: it provides services to other objects and threads in the same application, and its run() method is usually an endless loop that waits for service requests.
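
A minimal sketch of such a service thread, marked as a daemon with setDaemon(true) so that its endless loop does not keep the program alive (the loop body is illustrative):

public class DaemonDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread service = new Thread(new Runnable() {
            public void run() {
                while (true) {                   // endless service loop
                    System.out.println("waiting for a request...");
                    try {
                        Thread.sleep(100);
                    } catch (InterruptedException e) {
                        return;
                    }
                }
            }
        });
        service.setDaemon(true);                 // must be set before start()
        service.start();
        Thread.sleep(300);
        // main ends here; the JVM exits even though the daemon loop never returns
    }
}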

(5) Thread Group
Every Java thread is a member of a thread group. A thread group provides a mechanism for collecting multiple threads in a single object so that they can be manipulated as a whole; for example, a single method call can start or suspend all the threads in a group. Java thread groups are implemented by the ThreadGroup class.

When a thread is created, you can specify its thread group or let the runtime system place it in a default group. A thread can belong to only one thread group, and it cannot change groups after it has been created.
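
A short sketch of grouping threads with ThreadGroup and acting on the whole group; interrupt() is used here because the group-wide suspend() and stop() methods are deprecated, and the names are illustrative:

public class GroupDemo {
    public static void main(String[] args) throws InterruptedException {
        ThreadGroup group = new ThreadGroup("workers");
        Runnable task = new Runnable() {
            public void run() {
                try {
                    Thread.sleep(10000);
                } catch (InterruptedException e) {
                    System.out.println(Thread.currentThread().getName() + " interrupted");
                }
            }
        };
        new Thread(group, task, "worker-1").start();
        new Thread(group, task, "worker-2").start();
        Thread.sleep(100);
        System.out.println("active threads in group: " + group.activeCount());
        group.interrupt();                       // act on all threads in the group at once
    }
}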

III. Multithreaded programming

There is no need to say much here about the benefits of multithreading, but it does bring some new problems. These problems are not too difficult to overcome as long as you pay special attention to them during design.

(1) Thread synchronization
Many threads must deal with shared data or coordinate their execution state with other threads while they run. This requires a synchronization mechanism. In Java, every object has a lock associated with it, but Java does not provide separate lock and unlock operations; locking is performed implicitly by higher-level constructs that guarantee the corresponding operations are paired. (Note, however, that the Java virtual machine does provide separate monitorenter and monitorexit instructions to implement lock and unlock operations.)

The synchronized statement evaluates an object reference, attempts to perform a lock operation on that object, and does not proceed until the lock operation completes. Once the lock is obtained, the body of the synchronized statement executes. When the body finishes (whether normally or abruptly), the unlock operation is performed automatically. As an object-oriented language, Java most often uses synchronized with methods. A good rule is that if a variable can be assigned by one thread and read or assigned by another, all access to that variable should occur within a synchronized statement or a synchronized method.
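
A minimal sketch of the two forms, a synchronized statement locking an explicit object and a synchronized method locking this (the Account class is illustrative):

public class Account {
    private final Object lock = new Object();
    private int balance = 0;

    public void deposit(int amount) {
        synchronized (lock) {                    // lock acquired here ...
            balance += amount;
        }                                        // ... and released automatically here
    }

    public synchronized int getBalance() {       // equivalent to synchronized (this) { ... }
        return balance;
    }
}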

Now suppose threads 1 and 2 both access a data area, and thread 1's access must happen before thread 2's. In that case synchronized alone is not enough. On UNIX or Windows NT this could be implemented with semaphores, which Java does not provide; Java instead provides the wait() and notify() mechanism, used as follows:

synchronized method1(...) {      // called by thread 1
    // access the data area
    available = true;
    notify();
}

synchronized method2(...) {      // called by thread 2
    while (!available) {
        try {
            wait();              // wait for notify()
        } catch (InterruptedException e) {
        }
    }
    // access the data area
}
Here available is a member variable whose initial value is false.

If method2 finds available to be false, it calls wait(). The effect of wait() is to put thread 2 into the non-runnable state and release the lock on the object, which allows method1 to be invoked by thread 1. When notify() is executed, thread 2 changes from the non-runnable state back to the runnable state. When the call to method1 returns, thread 2 can lock the object again, and the statement after wait() executes once the lock has been acquired. The same mechanism can be applied to other, more complex situations.
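
Putting the pattern above into a complete, runnable form, under the assumption that the "data area" is just a single int field (the HandOff class and its names are illustrative):

public class HandOff {
    private boolean available = false;           // initial value is false
    private int data;

    public synchronized void produce(int value) {    // called by thread 1
        data = value;                            // access the data area
        available = true;
        notify();                                // wake up the waiting thread
    }

    public synchronized int consume() {          // called by thread 2
        while (!available) {
            try {
                wait();                          // releases the lock and waits for notify()
            } catch (InterruptedException e) {
            }
        }
        return data;                             // access the data area
    }

    public static void main(String[] args) {
        final HandOff h = new HandOff();
        new Thread(new Runnable() {              // thread 2
            public void run() {
                System.out.println("consumed: " + h.consume());
            }
        }).start();
        h.produce(42);                           // thread 1 is the main thread here
    }
}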

(2) Deadlock
If several concurrent threads in a program compete for resources, it is important to keep the system balanced. System balance means that every thread gets sufficient access to the limited resources while it runs, so that no thread starves and no deadlock occurs. Java does not provide a mechanism for detecting deadlocks, so preventing them is the better choice for most Java programmers. The simplest way to prevent deadlock is to assign a serial number to each contended resource: if a thread needs several resources, it must acquire the lower-numbered resources before requesting the higher-numbered ones.
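
A small sketch of that numbering rule: both tasks acquire the lower-numbered lock before the higher-numbered one, so a circular wait cannot form (names are illustrative):

public class LockOrdering {
    // Resources are numbered; lock1 is "smaller" than lock2.
    private static final Object lock1 = new Object();
    private static final Object lock2 = new Object();

    static void taskA() {
        synchronized (lock1) {                   // always take the small number first
            synchronized (lock2) {
                System.out.println("taskA holds both locks");
            }
        }
    }

    static void taskB() {
        synchronized (lock1) {                   // same order as taskA, so no deadlock
            synchronized (lock2) {
                System.out.println("taskB holds both locks");
            }
        }
    }

    public static void main(String[] args) {
        new Thread(new Runnable() { public void run() { taskA(); } }).start();
        new Thread(new Runnable() { public void run() { taskB(); } }).start();
    }
}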

IV. Comparison of threads and processes


A process is the basic unit of resource allocation. All resources associated with a process are recorded in its process control block (PCB), to indicate that the process owns or is using those resources.
A process is also the unit of preemptive processor scheduling, and it has a complete virtual address space.

A thread, by contrast, is not involved in resource allocation: it belongs to a process and shares the process's resources with the other threads in that process.
During scheduling, different processes have different virtual address spaces, while different threads within the same process share the same address space.

A thread consists only of its stack (kernel stack or user stack), its registers, and its thread control block (TCB). Registers can hold a thread's local variables but cannot hold variables belonging to other threads.

Compared with a thread switch, a process switch involves saving resource pointers and changing the address space. A thread switch does not involve changing resources or the address space, because the threads of one process share them, so the operating system overhead is smaller. Furthermore, process scheduling and switching are performed by the operating system kernel, while thread switching can be done either by the kernel or by a user program.

Figure 1 The relationship between multiple threads and processes

V. Scope of application of threads

Typical applications:

1. File management or communication control in the server

2. Foreground and background processing

3. Asynchronous processing


VI. Execution characteristics of threads

A thread must be in one of the following four possible states:

Initial state: the state after the thread object has been created with new and before the start method is called. In the initial state, the start and stop methods can be called.

Runnable: once the thread calls the start method, it moves to the runnable state. Note that a thread in the runnable state may not actually be running, because priority and scheduling also come into play.

Blocked/non-runnable: the thread is in the blocked/non-runnable state for one of two reasons: it was suspended, or it is blocked for some reason, such as waiting for an I/O request to complete.

Exit: the thread moves to the exit state for one of two reasons: the run method has finished, or the stop method has been called.

The last concept is thread priority. Threads can be given priorities, and higher-priority threads are scheduled to complete before lower-priority threads. An application can set a thread's priority with the setPriority(int) method of the Thread class.

There are 5 basic operations of threads:

Spawn: a thread is spawned within a process; it can be spawned either by the process itself or by another thread.
Block: if a thread needs to wait for an event during execution, it is blocked.
Unblock (activate): when the event a thread is blocked on occurs, the thread is activated and placed in the ready queue.
Schedule (dispatch): a ready thread is selected to enter the running state.
Finish: when a thread finishes executing, its register context and stack contents are released.


Figure 2 Status and operation of the thread
Another execution attribute of threads is synchronization. The synchronization control mechanisms used between threads are the same as those used between processes.

VII. Classification of threads

There are two basic types of threads:

User-level threads: thread management is done entirely by the user program; the operating system kernel only manages processes.

System-level threads (kernel-level threads): managed by the operating system kernel. The kernel provides applications with the corresponding system calls and APIs so that user programs can create, execute, and destroy threads.

Appendix: Examples of thread implementations

1. SUN Solaris 2.3

Solaris supports kernel threads, lightweight processes (LWPs), and user threads. A process can have many user threads; the many user threads are multiplexed onto a smaller number of lightweight processes, and each lightweight process corresponds one-to-one to a kernel thread.
A user-level thread must be bound to an LWP when it invokes a kernel service, such as a file read or write. The binding may be permanent (the LWP is dedicated to a user-level thread and removed from the LWP pool) or temporary (an idle LWP is allocated from the LWP pool for the duration of the call).
When a kernel service is invoked and all LWPs are already occupied (bound) by other user-level threads, the calling thread blocks until an LWP becomes available.
If an LWP blocks while executing a system call (such as read()), the user-level thread currently bound to that LWP blocks as well.

Figure 3 Relationship between user threads, lightweight processes, and core threads

Related C library functions:

    /* Create a user-level thread */
    int thr_create(void *stack_base, size_t stack_size,
        void *(*start_routine)(void *), void *arg, long flags,
        thread_t *new_thread_id);

Here flags may include THR_BOUND (permanent binding) and THR_NEW_LWP (create a new LWP and add it to the LWP pool); if both are specified, two new LWPs are created, one permanently bound and the other added to the LWP pool.

Related system calls:

    /* Create an LWP in the current process */
    int _lwp_create(ucontext_t *contextp, unsigned long flags,
        lwpid_t *new_lwp_id);

    /* Construct an LWP context */
    void _lwp_makecontext(ucontext_t *ucp,
        void (*start_routine)(void *), void *arg,
        void *private, caddr_t stack_base, size_t stack_size);

    /* Note: these system calls have no "bind" operation */

2. Windows NT
The context of an NT thread includes its registers, kernel stack, thread environment block, and user stack.

NT thread states:
(1) Ready: the thread has obtained all the resources it needs except the processor and is waiting to execute.
(2) Standby: the thread has been selected to run next on a particular processor; at most one thread per processor can be in the standby state.
(3) Running: once the context switch completes, the thread enters the running state and runs until the kernel preempts it, its time quantum expires, it terminates, or it enters a wait state.
(4) Waiting: the thread is waiting on an object handle to synchronize its execution; when the wait ends, it moves to the running or ready state according to its priority.
(5) Transition: a thread enters the transition state when it is ready to execute but its kernel stack has been paged out to external storage; when the kernel stack is brought back into memory, the thread enters the ready state.
(6) Terminated: the thread has finished executing and enters the terminated state; if the executive still holds a pointer to the thread object, the object can be reinitialized and reused.


Figure 4 The state of the Windows NT thread


The API for NT threads

The CreateThread() function creates a thread in the address space of the calling process to execute a specified function; its return value is a handle to the created thread.
The ExitThread() function terminates the calling thread.
The SuspendThread() function suspends the specified thread.
The ResumeThread() function decrements the specified thread's suspend count; when the count reaches 0, the thread resumes execution.
