1-3 Java Concurrency and Multithreading Basics

1. Introduction to Concurrency and Multithreading

Early computers ran a single task at a time. They later evolved to run multiple tasks (processes) concurrently, with the operating system scheduling the tasks and giving each one a slice of CPU time. Under multitasking, a task has to yield system resources to the other tasks once its time slice is used up.

Later still, a single task evolved to run multiple threads concurrently, with those threads reading and writing the same memory space. With modern multicore CPUs, different threads can also be executed truly in parallel on different CPU cores. Some of the problems that arise with multithreading are similar to those found in multitasking and distributed systems, so this series will occasionally refer to multitasking and distributed systems as well, which is why the topic is called concurrency.

2. Advantages of Multithreading

    • Better resource utilization
    • Programming is easier in some cases
    • Faster program response

Better resource utilization

Imagine an application that needs to read and process files from the local file system. Say it takes 5 seconds to read a file from disk and 2 seconds to process it. Reading and processing two files then takes:

5 seconds reading file A
2 seconds processing file A
5 seconds reading file B
2 seconds processing file B
---------------------
Total: 14 seconds

While a file is being read from disk, most of the CPU time is spent waiting for the disk to deliver the data. The CPU is largely idle during that time and could be doing something else. By changing the order of the operations, the CPU can be utilized better. Look at the following order:

5 seconds reading file A
5 seconds reading file B + 2 seconds processing file A
2 seconds processing file B
---------------------
Total: 12 seconds

First the CPU waits for the first file to be read. Then it starts the read of the second file, and while the second file is being read, the CPU processes the first file. Remember that the CPU is mostly idle while waiting for the disk to read a file.

Generally speaking, the CPU can do something else while waiting for IO. It doesn't have to be disk IO; it can also be network IO or input from a user. Network and disk IO are typically much slower than CPU and memory IO.

Easier programming

In a single-threaded application, if you want to implement the read-and-process ordering described above by hand, you must keep track of the read and processing state of each file. Instead, you can start two threads, each of which reads and processes one file. A thread blocks while it waits for the disk to read its file; while it waits, the other thread can use the CPU to process a file that has already been read. The result is that the disk is kept busy reading different files into memory, which increases both disk and CPU utilization. And since each thread only has to keep track of a single file, this is also easy to implement programmatically.
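
To make this concrete, here is a minimal Java sketch of the two-thread approach. The file names a.txt and b.txt and the process() method are placeholders for illustration only:

import java.nio.file.Files;
import java.nio.file.Path;

public class TwoFileThreads {

    public static void main(String[] args) {
        // One thread per file: each thread reads its file and then processes it.
        Thread threadA = new Thread(() -> readAndProcess("a.txt"));
        Thread threadB = new Thread(() -> readAndProcess("b.txt"));
        threadA.start();
        threadB.start();
    }

    private static void readAndProcess(String fileName) {
        try {
            // The thread blocks here while the disk delivers the data;
            // the other thread can use the CPU in the meantime.
            byte[] data = Files.readAllBytes(Path.of(fileName));
            process(data);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private static void process(byte[] data) {
        // Placeholder for the 2 seconds of CPU-bound processing.
        System.out.println("Processed " + data.length + " bytes");
    }
}

Each thread only manages its own file, which is exactly the simplification the text describes.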

Faster program response

Another common goal of turning a single-threaded application into a multithreaded application is to achieve a faster-responding application. Imagine a server application that listens for incoming requests on some port. When a request arrives, it handles the request and then goes back to listening.

The process of the server is described below:

while(server is active){
    listen for request
    process request
}

If a request takes a long time to process, no new client can get its request served during that time, because requests are only received while the server is listening. An alternative design is to have the listening thread hand the request off to a worker thread and immediately return to listening. The worker thread processes the request and sends the reply back to the client. This design is described as follows:

while(server is active){
    listen for request
    hand request to worker thread
}

This way the listening thread returns to listening very quickly, so more clients can send requests to the server, and the server becomes more responsive.
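
Below is a minimal Java sketch of this listener/worker design, using a ServerSocket and a fixed thread pool as the worker threads. The port number 8080, the pool size of 10, and the handle() method are illustrative assumptions, not part of the original article:

import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class WorkerThreadServer {

    public static void main(String[] args) throws IOException {
        // A fixed pool of worker threads that process the requests.
        ExecutorService workers = Executors.newFixedThreadPool(10);
        try (ServerSocket serverSocket = new ServerSocket(8080)) {
            while (true) {
                // Listen for the next request ...
                Socket client = serverSocket.accept();
                // ... hand it to a worker thread and go straight back to listening.
                workers.submit(() -> handle(client));
            }
        }
    }

    private static void handle(Socket client) {
        // Placeholder for reading the request and writing a reply.
        try (client) {
            client.getOutputStream().write("HTTP/1.1 200 OK\r\n\r\n".getBytes());
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

The fixed pool also caps the number of concurrent worker threads, so a burst of slow requests queues up instead of spawning an unbounded number of threads.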

The same is true for desktop applications. If a click on a button starts a long-running task in the same thread that updates the window and its buttons, the application appears unresponsive while the task executes. Instead, the task can be handed to a worker thread. While the worker thread is busy with the task, the window thread remains free to respond to other user requests. When the worker thread finishes, it signals the window thread, which can then update the application window and display the result of the task. To the user, a program designed with worker threads like this appears to respond much faster.
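
One standard way to implement this design in a Swing desktop application is javax.swing.SwingWorker. The sketch below is only an illustration of the pattern, with a 5-second sleep standing in for the time-consuming task:

import java.awt.BorderLayout;
import javax.swing.JButton;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.SwingUtilities;
import javax.swing.SwingWorker;

public class ResponsiveUiDemo {

    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Worker thread demo");
            JLabel status = new JLabel("Idle");
            JButton button = new JButton("Start task");

            button.addActionListener(e -> {
                status.setText("Working...");
                new SwingWorker<String, Void>() {
                    @Override
                    protected String doInBackground() throws Exception {
                        // Runs on a worker thread, so the window thread stays responsive.
                        Thread.sleep(5000);   // stand-in for the time-consuming task
                        return "Task finished";
                    }

                    @Override
                    protected void done() {
                        // Runs back on the window (event dispatch) thread.
                        try {
                            status.setText(get());
                        } catch (Exception ex) {
                            status.setText("Task failed");
                        }
                    }
                }.execute();
            });

            frame.add(status, BorderLayout.NORTH);
            frame.add(button, BorderLayout.SOUTH);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setSize(300, 120);
            frame.setVisible(true);
        });
    }
}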

3. The Cost of Multithreading

Going from a single-threaded to a multithreaded application does not bring only benefits; it also has costs. Don't use multithreading just for the sake of using it. Use it only when you are reasonably sure the benefits outweigh the costs. If in doubt, measure the performance and responsiveness of your application rather than guessing.

More complex design

While some parts of a multithreaded application are simpler than their single-threaded counterparts, other parts are usually more complex. Code that accesses shared data from multiple threads needs special attention. The interactions between threads can be very intricate, and incorrect thread synchronization produces errors that are hard to detect, reproduce, and fix.
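
As a contrived illustration of how shared data can go wrong without synchronization, the sketch below lets two threads each increment a shared counter one million times; the final value is usually less than the expected 2,000,000 because the unsynchronized counter++ loses updates:

public class RaceConditionDemo {

    // Shared data accessed by two threads without synchronization.
    private static int counter = 0;

    public static void main(String[] args) throws InterruptedException {
        Runnable increment = () -> {
            for (int i = 0; i < 1_000_000; i++) {
                counter++;   // read-modify-write: not atomic
            }
        };

        Thread t1 = new Thread(increment);
        Thread t2 = new Thread(increment);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        // Frequently prints less than 2,000,000 because updates are lost.
        System.out.println("Counter: " + counter);
    }
}

Guarding the increment with synchronized, or using an AtomicInteger from java.util.concurrent.atomic, makes the result correct again; reasoning about exactly this kind of issue is what makes multithreaded designs more complex.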

The cost of context switching

When the CPU switches from executing one thread to executing another, it needs to save the local data, program pointer, and so on of the current thread, and then load the local data, program pointer, and so on of the next thread before it can start executing it. This switch is called a "context switch": the CPU executes one thread in one context, then switches to another thread in another context.

Context switching is not cheap; you don't want to switch contexts more than necessary.

Increased resource consumption

A thread needs some resources from the computer in order to run. Besides CPU time, a thread needs memory for its own stack, and the operating system needs some internal resources to manage the thread. Try writing a program that creates 100 threads that do nothing but wait, and watch how much memory the program consumes while it runs.
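
A minimal version of that experiment might look like the sketch below; the choice of 100 sleeping threads comes straight from the text, while observing the memory usage is left to a tool such as jconsole or the operating system's task manager:

public class HundredIdleThreads {

    public static void main(String[] args) {
        // Create 100 threads that do nothing but sleep.
        for (int i = 0; i < 100; i++) {
            Thread t = new Thread(() -> {
                try {
                    Thread.sleep(Long.MAX_VALUE);   // just wait
                } catch (InterruptedException e) {
                    // exit when interrupted
                }
            });
            t.start();
        }
        // While this runs, observe the process's memory usage;
        // each thread reserves its own stack.
        System.out.println("100 idle threads started");
    }
}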

Reprinted from the Concurrent Programming Network (ifeve.com). Original article: Multi-Threading Advantages.
