Examples of Java multi-threading scenarios and application purposes

Source: Internet
Author: User

The main purposes of using multiple threads are:

1. Throughput: if you are building for the Web, the container handles multithreading for you, but only at the request level. Roughly speaking, that means one thread per request, or one thread serving several requests. With a single thread, only one user's request could be handled at a time.

2. Scalability: that is, the ability to improve performance by adding CPU cores. A single-threaded program runs to completion on a single core no matter what, so there is no way for extra cores to make it faster.

Since you are doing Web work, point 1 barely concerns you, so let me focus on point 2.
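To make "the container does request-level multithreading for you" concrete, here is a minimal sketch of a thread-per-connection server. Everything in it (the echo protocol, the handler, the helper names) is invented for illustration; real containers use a bounded pool rather than an unbounded thread per connection, but the division of labor is the same.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

public class ThreadPerRequestServer {

    // Hypothetical handler: reads one line and echoes it back upper-cased
    static void handle(Socket s) {
        try (BufferedReader in = new BufferedReader(new InputStreamReader(s.getInputStream()));
             PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
            out.println(in.readLine().toUpperCase());
        } catch (IOException ignored) {
        }
    }

    // Starts the accept loop on a daemon thread; returns the chosen port
    static int start() throws IOException {
        ServerSocket ss = new ServerSocket(0); // 0 = pick any free port
        Thread acceptor = new Thread(() -> {
            try {
                while (true) {
                    Socket s = ss.accept();
                    new Thread(() -> handle(s)).start(); // one request, one thread
                }
            } catch (IOException ignored) {
            }
        });
        acceptor.setDaemon(true);
        acceptor.start();
        return ss.getLocalPort();
    }

    // Simple client used to exercise the server
    static String request(int port, String line) throws IOException {
        try (Socket s = new Socket("localhost", port);
             PrintWriter out = new PrintWriter(s.getOutputStream(), true);
             BufferedReader in = new BufferedReader(new InputStreamReader(s.getInputStream()))) {
            out.println(line);
            return in.readLine();
        }
    }

    public static void main(String[] args) throws IOException {
        int port = start();
        System.out.println(request(port, "hello"));
    }
}
```

The point is that this accept loop lives inside the container, which is why your servlet code never has to write it.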

To give a simple example:
Suppose handling one request requires three very slow IO operations on the server (database or file reads, say). A straightforward sequential flow might look like this (execution times in parentheses):
A. read file 1 (10ms)
B. process data 1 (1ms)
C. read file 2 (10ms)
D. process data 2 (1ms)
E. read file 3 (10ms)
F. process data 3 (1ms)
G. integrate the results of 1, 2 and 3 (1ms)
A single thread needs 34ms in total.
If, within this one request, you hand AB, CD and EF to three separate threads, the three 11ms branches run in parallel, and after the final 1ms integration step the whole request takes only about 12ms.
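A sketch of that split using an ExecutorService. The readFile/process bodies are stand-ins invented for the example (the 10ms reads are simulated with sleeps); nothing here is the original poster's code.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelIoDemo {

    // Stand-in for a 10ms file read
    static String readFile(int n) throws InterruptedException {
        Thread.sleep(10);
        return "data" + n;
    }

    // Stand-in for a 1ms processing step
    static String process(String raw) {
        return raw.toUpperCase();
    }

    // Runs AB, CD and EF on three threads, then does G on the caller's thread
    static String handleRequest() throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(3);
        try {
            Future<String> ab = pool.submit(() -> process(readFile(1)));
            Future<String> cd = pool.submit(() -> process(readFile(2)));
            Future<String> ef = pool.submit(() -> process(readFile(3)));
            // G: integrate the three results once all branches are done
            return ab.get() + "|" + cd.get() + "|" + ef.get();
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        long start = System.nanoTime();
        String merged = handleRequest();
        long ms = (System.nanoTime() - start) / 1_000_000;
        System.out.println(merged + " in ~" + ms + "ms"); // roughly 12ms, not 34ms
    }
}
```

Future.get() blocks until its branch is done, so the integration step G naturally waits for all three threads.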

So it is not that multithreading is hard to apply; rather, you have to get good at spotting the points that can be optimized, and then judge whether the scenario actually warrants it.
Take the same problem as above, but with different execution times per step:
A. read file 1 (1ms)
B. process data 1 (1ms)
C. read file 2 (1ms)
D. process data 2 (1ms)
E. read file 3 (28ms)
F. process data 3 (1ms)
G. integrate the results of 1, 2 and 3 (1ms)
A single thread still needs 34ms in total.
If you keep the same partitioning scheme, the bucket principle applies: total time is governed by the slowest thread. Here that is the third thread at 29ms, so the request takes 30ms overall, saving only 4ms over the single-threaded version. But a thread context switch can itself cost 1-2ms, so the advantage of this scheme is marginal while the program's complexity clearly goes up. It is not worth it.

So the optimization point this time is not splitting the work across threads as in the first example, but speeding up the read of file 3.
A cache can do that by eliminating repeated reads.
Suppose every user issues this request, so effectively every user needs to read file 3. Think about it: if 100 users make the request, you spend 28 × 100 = 2800ms just reading that one file. If you cache it instead, only the first user's request actually reads the file; every later request fetches it from memory, which is very fast, perhaps 1ms.

Pseudo code:

```java
public class MyServlet extends Servlet {
    private static Map<String, String> fileName2Data = new HashMap<String, String>();

    private void processFile3(String fName) {
        String data = fileName2Data.get(fName);
        if (data == null) {
            data = readFromFile(fName); // takes 28ms
            fileName2Data.put(fName, data);
        }
        // process with data
    }
}
```


It looks good: build a map from file name to data, and if the data is already in the map, skip the file read.
The problem is that servlets run concurrently, and that can produce a very serious failure: an infinite loop. When a HashMap is resized under concurrent modification, its bucket chains can end up forming a cycle (famously possible before Java 8; read the HashMap source for the details). If you have not worked with threads much, all you will see is that the server hangs on requests for no apparent reason.
OK, so use ConcurrentHashMap. As the name says, it is a thread-safe hash map, and it appears to solve the problem easily.

```java
public class MyServlet extends Servlet {
    private static ConcurrentHashMap<String, String> fileName2Data = new ConcurrentHashMap<String, String>();

    private void processFile3(String fName) {
        String data = fileName2Data.get(fName);
        if (data == null) {
            data = readFromFile(fName); // takes 28ms
            fileName2Data.put(fName, data);
        }
        // process with data
    }
}
```



This really does fix that problem: while one user is loading file A, another user who wants file A simply takes the data from fileName2Data, and no infinite loop can occur.

However, if you think that is the end of it, you are taking multithreading far too lightly, young man!
You will find that when 1000 users access the same file for the first time, the file gets read up to 1000 times (that is the extreme case; in practice it is probably a few hundred). What on earth?!

Is the code wrong? It looks perfectly innocent!

Analyze it carefully. The servlet is multithreaded, so:

```java
public class MyServlet extends Servlet {
    private static ConcurrentHashMap<String, String> fileName2Data = new ConcurrentHashMap<String, String>();

    private void processFile3(String fName) {
        String data = fileName2Data.get(fName);
        // "coincidence" -- 1000 threads reach this point at once and all see data == null
        if (data == null) {
            data = readFromFile(fName); // takes 28ms
            fileName2Data.put(fName, data);
        }
        // process with data
    }
}
```


The "coincidence" in the comment above is entirely possible, so there is still a problem: the null check and the put are not atomic.
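To make that "coincidence" reproducible rather than timing-dependent, the sketch below uses a CountDownLatch to hold every thread past the null check before any of them writes. The latch is artificial scaffolding invented for this demo, but the check-then-act gap it widens is exactly the one in the servlet above.

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.atomic.AtomicInteger;

public class CheckThenActRace {

    // Returns how many times the "file" was actually read by n concurrent threads
    static int run(int n) throws InterruptedException {
        ConcurrentHashMap<String, String> cache = new ConcurrentHashMap<>();
        AtomicInteger reads = new AtomicInteger();
        CountDownLatch allChecked = new CountDownLatch(n);

        Runnable task = () -> {
            String data = cache.get("file3");
            if (data == null) {                 // every thread sees null...
                allChecked.countDown();
                try {
                    allChecked.await();         // ...and waits until all have checked
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
                reads.incrementAndGet();        // so every thread "reads the file"
                cache.put("file3", "contents");
            }
        };

        Thread[] ts = new Thread[n];
        for (int i = 0; i < n; i++) ts[i] = new Thread(task);
        for (Thread t : ts) t.start();
        for (Thread t : ts) t.join();
        return reads.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("1000 threads caused " + run(1000) + " reads of the same file");
    }
}
```

Without the latch the read count depends on scheduling luck, which is why the bug only shows up under real load.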

Therefore, wrap the read in a FutureTask, so that only one thread ever triggers it:

```java
public class MyServlet extends Servlet {
    private static ConcurrentHashMap<String, FutureTask<String>> fileName2Data =
            new ConcurrentHashMap<String, FutureTask<String>>();
    private static ExecutorService exec = Executors.newCachedThreadPool();

    private void processFile3(String fName) {
        FutureTask<String> data = fileName2Data.get(fName);
        // "coincidence" -- 1000 threads may still arrive here at once and all see null
        if (data == null) {
            data = newFutureTask(fName);
            FutureTask<String> old = fileName2Data.putIfAbsent(fName, data);
            if (old != null) {
                data = old;          // another thread registered its task first; reuse it
            } else {
                exec.execute(data);  // this thread won the race; the read runs exactly once
            }
        }
        try {
            String d = data.get();   // blocks until the single read completes
            // process with d
        } catch (InterruptedException | ExecutionException e) {
            throw new RuntimeException(e);
        }
    }

    private FutureTask<String> newFutureTask(final String file) {
        return new FutureTask<String>(new Callable<String>() {
            public String call() {
                return readFromFile(file); // takes 28ms
            }
        });
    }

    private String readFromFile(String file) { return ""; }
}
```



All of the above code was typed straight into the forum and is not guaranteed to run as-is.
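On Java 8 and later there is a shorter route than a hand-rolled FutureTask: ConcurrentHashMap.computeIfAbsent runs the loader at most once per key, and concurrent callers for that key block until it finishes. A sketch under that assumption (readFromFile here is a stand-in that just counts how often it actually runs):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

public class ComputeIfAbsentCache {

    static final ConcurrentHashMap<String, String> fileName2Data = new ConcurrentHashMap<>();
    static final AtomicInteger reads = new AtomicInteger();

    // Stand-in for the 28ms read; counts real invocations
    static String readFromFile(String fName) {
        reads.incrementAndGet();
        return "contents-of-" + fName;
    }

    static String processFile3(String fName) {
        // Atomic per key: the mapping function runs at most once,
        // so 1000 concurrent first requests still read the file once
        return fileName2Data.computeIfAbsent(fName, ComputeIfAbsentCache::readFromFile);
    }

    // Hammers the cache with n threads and reports the actual read count
    static int stress(int n) throws InterruptedException {
        Thread[] ts = new Thread[n];
        for (int i = 0; i < n; i++) ts[i] = new Thread(() -> processFile3("file3"));
        for (Thread t : ts) t.start();
        for (Thread t : ts) t.join();
        return reads.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("1000 threads, " + stress(1000) + " actual read(s)");
    }
}
```

One caveat: the mapping function should be short and must not modify the map itself, because other updates to the same bin are blocked while it runs. For a 28ms read that trade-off may be acceptable, but it is a design choice, not a free lunch.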

The scenarios that use multithreading the most: Web servers themselves, and all kinds of dedicated servers (such as game servers).
Common application scenarios for multithreading:
1. Background tasks, for example sending mail to a large number of users (1,000,000+) on a schedule;
2. Asynchronous processing, for example posting a microblog entry or writing logs;
3. Distributed computing.
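As a sketch of scenario 2 (asynchronous processing), the request thread below hands each slow "log write" to a single-threaded background worker and moves on immediately. The demo method, the in-memory sink, and the 5ms sleep are all invented for illustration:

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class AsyncLogDemo {

    // Hands each slow "log write" to a background worker; the calling
    // (request) thread is free again as soon as the loop ends
    static List<String> demo(String... lines) throws InterruptedException {
        ExecutorService logWorker = Executors.newSingleThreadExecutor();
        List<String> sink = new CopyOnWriteArrayList<>(); // stand-in for a log file
        for (String line : lines) {
            logWorker.execute(() -> {
                try {
                    Thread.sleep(5); // simulate slow disk or network IO
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
                sink.add(line);
            });
        }
        // ...the request would return to the user here, writes still pending...
        logWorker.shutdown();                             // demo only: flush and stop
        logWorker.awaitTermination(10, TimeUnit.SECONDS);
        return sink;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(demo("user 42 posted", "user 43 posted"));
    }
}
```

A single-threaded executor also preserves submission order, which matters for logs; with a multi-threaded pool the writes could interleave.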
