Emergency Handling Schemes and Practices for High Concurrency, Part 2 -- Using Caching
The methods provided in this series on emergency handling for high concurrency are intended only as references for emergencies; they are not something to rely on at design time. At the design stage we should adopt a more reasonable architecture to avoid the problems discussed in this article. For such an architecture, please refer to my other article, "Opening a Restaurant and Writing Software -- How to Improve the Performance of Large Web Sites".
Resources can be divided into two types: resources that forbid concurrent access, and resources that allow concurrent access. An example of a resource that forbids concurrent access is a highway toll booth: at any moment, each booth handles only one pass-through request. When many vehicles request to pass at once, there are two solutions. One is to add more booths; the other is to make the vehicles wait in line, that is, not to process a request immediately but to let it wait until a resource is free. I call this workaround backward asynchronous processing: the request is deferred for a period of time, which is acceptable when the user does not need to see the effect immediately. For an example of backward asynchronous processing, see "Emergency Handling Schemes and Practices for High Concurrency, Part 1 -- Asynchronous Processing".
An example of a resource that allows concurrent access is a shop: any customer can buy the goods they need at any time. In the past, most shops had a counter, and customers bought goods through a salesperson. As the number of customers grows, the salespeople become overwhelmed. Making customers queue is one option, but no shop will force its customers to line up just to buy goods, so backward asynchronous processing is not suitable for this kind of problem. A better solution was therefore invented: the supermarket. Part of the stock is placed directly on self-service shelves for customers to take themselves, and only one clerk is needed to keep an eye on the shelves and fetch replacements from the warehouse when goods run low.
In other words, the original process:
Customer -> Salesperson 1 -> Product 1 -> Pay
Customer -> Salesperson 2 -> Product 2 -> Pay
Customer -> Salesperson 3 -> Product 3 -> Pay
is transformed into:
Warehouse -> Clerk -> Self-service shelf (cache)   (forward asynchronous processing)
Customer -> Goods (1, 2, 3) -> Queue to pay        (backward asynchronous processing)
This is a typical example. The customer does not need an immediate response to the payment request, so we can handle it with backward asynchronous processing. But the request to see the goods should be satisfied as quickly as possible, so we use the other method, forward asynchronous processing: the goods (data) are placed on the self-service shelf (cache) in advance, so that the customer (request) can take them directly when needed. Now let's look at an example you might encounter:
One day your boss suddenly receives a phone call: customers are complaining that when the number of users increases, pages open very slowly. The boss comes to you; you do your best to explain, and the boss looks at you helplessly and says, "The users are very anxious. It's all up to you."
You go back to your seat and start troubleshooting, and you find that opening these pages triggers a large number of database operations, which is why they open so slowly. For this kind of problem, a simpler and faster way to transform the system is to use caching.
That is, the following structure:
User -> Page and logic -> Database
is transformed into:
Database -> Monitoring program -> Cache   (forward asynchronous processing)
User -> Page and logic -> Cache
For example, suppose you find that you need to speed up the page test.jsp, which gets its data through the class TestDb.java:
public class TestDb {
    ...
    // Read database 1
    String sql1 = "...";                   // query statement
    List tmpList1 = queryBySql(sql1);      // queryBySql executes the SQL statement and returns the result in a List
    ...
    // Read database 2
    String sql2 = "...";                   // query statement
    List tmpList2 = queryBySql(sql2);      // queryBySql executes the SQL statement and returns the result in a List
    ...
    // Read database 3
    String sql3 = "...";                   // query statement
    List tmpList3 = queryBySql(sql3);      // queryBySql executes the SQL statement and returns the result in a List
    ...
}
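The article never shows queryBySql itself. For completeness, here is a minimal sketch of what such a helper might look like, assuming plain JDBC; the class name DbHelper and the getConnection() placeholder are my own assumptions, not part of the original code.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class DbHelper {
    // Executes the given SQL query and returns each row as a Map of column label to value.
    public static List queryBySql(String sql) {
        List rows = new ArrayList();
        try (Connection conn = getConnection();
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(sql)) {
            ResultSetMetaData meta = rs.getMetaData();
            while (rs.next()) {
                Map row = new HashMap();
                for (int i = 1; i <= meta.getColumnCount(); i++) {
                    row.put(meta.getColumnLabel(i), rs.getObject(i));
                }
                rows.add(row);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        return rows;
    }

    private static Connection getConnection() throws Exception {
        // Placeholder: replace with your own JDBC URL, credentials, or DataSource lookup.
        return DriverManager.getConnection("jdbc:...", "user", "password");
    }
}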
First, we create a cache module corresponding to the page (for example, a package named buffer), and in it a class TestBuffer.java to hold the data that the page needs to cache. The data in TestBuffer.java must correspond one-to-one with the corresponding data in the database, so we design TestBuffer.java as a singleton.
#buffer
--TestBuffer.java
public class TestBuffer {
    private volatile static TestBuffer singleton = null;

    private TestBuffer() {}

    public static TestBuffer getInstance() {
        if (singleton == null) {                        // first check, without locking
            synchronized (TestBuffer.class) {
                if (singleton == null) {                // second check inside the lock (double-checked locking)
                    singleton = new TestBuffer();
                }
            }
        }
        return singleton;
    }
}
Next, we analyze the database access module that needs to be accelerated and find all of its database operation points. Analyzing TestDb.java, for example, we find the following operation points: read database 1, read database 2, and read database 3. These three operation points return three List objects, and those three List objects are exactly the data we want to cache, so we add three corresponding fields to TestBuffer.java.
import java.util.List;

public class TestBuffer {
    private volatile static TestBuffer singleton = null;

    private List tmpList1 = null;
    private List tmpList2 = null;
    private List tmpList3 = null;

    private TestBuffer() {}

    public static TestBuffer getInstance() {
        if (singleton == null) {
            synchronized (TestBuffer.class) {
                if (singleton == null) {
                    singleton = new TestBuffer();
                }
            }
        }
        return singleton;
    }

    public List getTmpList1() {
        return tmpList1;
    }
    public void setTmpList1(List tmpList1) {
        this.tmpList1 = tmpList1;
    }
    public List getTmpList2() {
        return tmpList2;
    }
    public void setTmpList2(List tmpList2) {
        this.tmpList2 = tmpList2;
    }
    public List getTmpList3() {
        return tmpList3;
    }
    public void setTmpList3(List tmpList3) {
        this.tmpList3 = tmpList3;
    }
}
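One detail worth noting (my own note, not part of the original article): the monitoring thread introduced below replaces whole List references rather than modifying lists in place, so a page request always sees either the complete old snapshot or the complete new one. To make a newly set reference visible to reader threads promptly, the three fields can also be declared volatile, for example:
private volatile List tmpList1 = null;   // reference swap is atomic; volatile makes the new snapshot visible to readers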
Then we transform the original database access module TestDb.java as follows:
public class TestDb {
    ...
    // The original direct database reads are removed (they move into the monitoring thread below):
    // String sql1 = "...";                 // query statement
    // String sql2 = "...";                 // query statement
    // String sql3 = "...";                 // query statement
    // List tmpList1 = queryBySql(sql1);
    // List tmpList2 = queryBySql(sql2);
    // List tmpList3 = queryBySql(sql3);

    // Instead, the data is now read from the cache:
    List tmpList1 = TestBuffer.getInstance().getTmpList1();
    List tmpList2 = TestBuffer.getInstance().getTmpList2();
    List tmpList3 = TestBuffer.getInstance().getTmpList3();
    ...
}
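One practical caveat (my own addition, not in the original): until the monitoring thread has run at least once, the cached lists are still null. A simple way to handle the very first requests is to fall back to a direct read when the cache is empty, assuming the original query string sql1 and the queryBySql helper are still reachable at that point:
List tmpList1 = TestBuffer.getInstance().getTmpList1();
if (tmpList1 == null) {                           // cache not filled yet
    tmpList1 = queryBySql(sql1);                  // fall back to a direct database read
    TestBuffer.getInstance().setTmpList1(tmpList1);
}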
Then we add the forward asynchronous processing module that moves data from the database into the cache. That is, we implement an active monitoring program that reads the data from the database, puts it into the cache, and keeps the cache synchronized with the database. We can implement it with a thread.
import java.util.List;

public class TestThread implements Runnable {
    private static long interval = 3000;      // refresh interval in milliseconds

    @Override
    public void run() {
        while (true) {
            ...
            String sql1 = "...";              // query statement
            String sql2 = "...";              // query statement
            String sql3 = "...";              // query statement
            List tmpList1 = queryBySql(sql1);
            TestBuffer.getInstance().setTmpList1(tmpList1);
            List tmpList2 = queryBySql(sql2);
            TestBuffer.getInstance().setTmpList2(tmpList2);
            List tmpList3 = queryBySql(sql3);
            TestBuffer.getInstance().setTmpList3(tmpList3);
            ...
            try {
                Thread.sleep(interval);       // wait before refreshing the cache again
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
}
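The article does not show where this thread is started. As a minimal sketch, in a servlet container it could be launched once at application startup from a ServletContextListener; the class name BufferStartupListener is my own, and any other place that runs once at startup would work just as well.
import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

public class BufferStartupListener implements ServletContextListener {
    @Override
    public void contextInitialized(ServletContextEvent sce) {
        Thread refresher = new Thread(new TestThread(), "test-buffer-refresher");
        refresher.setDaemon(true);    // let the container shut down without waiting for the loop
        refresher.start();
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        // nothing to clean up in this sketch
    }
}
The listener would then be registered in web.xml (or with @WebListener on Servlet 3.0+).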
The methods above are only a reference for when you run into a similar problem; the concrete steps, changes, and reasons will differ from case to case. Using caching and asynchronous processing sensibly is the core of solving this kind of problem.