High-Concurrency Emergency Solutions and Practices, Part 2: Using a Cache

The methods described in this series are intended only as emergency-response references. They are not recommended at design time; at the design stage you should adopt a more reasonable architecture that avoids the problems discussed here. For more on architecture, see my other article, "How to Improve the Performance of Large Websites".
Resources can be divided into two types: resources that prohibit concurrent access and resources that allow it.

A resource that prohibits concurrent access is like a highway toll station: each toll booth handles only one vehicle at a time. When many vehicles want to pass, there are two options: add more toll booths, or make the vehicles wait in line, that is, do not process a request immediately but let it wait until a resource becomes free. I call this approach backward asynchronous processing: pending requests are processed some time after they arrive. It is suitable when users do not need to see the result right away. For an example of backward asynchronous processing, see "High-Concurrency Emergency Solutions and Practices, Part 1: Asynchronous Processing".

A resource that allows concurrent access is like a shop, where any customer can buy at any time. In older shops, goods stood behind a counter and customers bought them through the sales staff; as the number of customers grows, the staff become overwhelmed. Making customers queue is technically an option, but no shop forces customers to line up just to buy goods, so backward asynchronous processing does not fit this problem. A better solution is the supermarket: goods are placed directly on self-service shelves so customers can pick them up themselves, while a staff member keeps watch and restocks from the warehouse whenever items run low.
The original process:
Customer -> salesperson 1 -> product 1 -> payment
Customer -> salesperson 2 -> product 2 -> payment
Customer -> salesperson 3 -> product 3 -> payment
After the transformation:
Warehouse -> staff member -> self-service shelf (cache) (forward asynchronous processing)
Customer -> products (1, 2, 3) -> queued payment (backward asynchronous processing)
This is a typical example. Customers do not need their payment requests answered in real time, so backward asynchronous processing fits there. Requests to look at the products, however, should be satisfied as quickly as possible, so for them we use the other approach, forward asynchronous processing: the products (the data) are placed on the self-service shelves (the cache) in advance, so that customers can take what they need directly.
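As a minimal sketch of the two ideas (the class name and the in-memory structures below are my own illustrative assumptions, not part of the example that follows): forward asynchronous processing fills a cache before requests arrive, while backward asynchronous processing queues requests and handles them later.

import java.util.Map;
import java.util.Queue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;

// Illustrative sketch only: "shelf" plays the role of the self-service shelf (cache),
// "paymentQueue" plays the role of the checkout line (deferred processing).
public class SupermarketSketch {

    // forward asynchronous processing: data is put on the shelf ahead of time
    static final Map<String, String> shelf = new ConcurrentHashMap<String, String>();

    // backward asynchronous processing: payment requests wait here for a background worker
    static final Queue<String> paymentQueue = new ConcurrentLinkedQueue<String>();

    public static void main(String[] args) {
        // the "warehouse worker" restocks the shelf in advance
        shelf.put("product-1", "details of product 1");

        // a customer reads directly from the shelf; no trip to the warehouse is needed
        System.out.println(shelf.get("product-1"));

        // the customer's payment is queued and processed later
        paymentQueue.offer("payment for product-1");

        // a background worker drains the queue when resources are free
        String payment;
        while ((payment = paymentQueue.poll()) != null) {
            System.out.println("processing " + payment);
        }
    }
}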
Here is a situation you may encounter. One day your boss suddenly gets a call: a customer complains that when the number of logged-in users grows, the pages become very slow. The boss comes to you, you do your best to explain, and the boss looks at you helplessly and says, "The customer is in a hurry. It's up to you."
Frustrated, you go back to your seat and start troubleshooting bit by bit. You find that opening these pages triggers a large number of database operations, and that is why they open so slowly. When this is the problem, a simple and quick fix is to use a cache.
The original structure:
User -> page and logic -> database
After the transformation:
Database -> monitoring program -> cache (forward asynchronous processing)
User -> page and logic -> cache
 

For example, suppose you find that the page to be accelerated, test.jsp, gets its data through the class TestDb.java:

public class TestDb {
    ...
    // database read 1
    String sql1 = "..."; // query statement
    // queryBySql executes the SQL statement and returns the result rows as a List
    List tmpList1 = queryBySql(sql1);
    ...
    // database read 2
    String sql2 = "..."; // query statement
    List tmpList2 = queryBySql(sql2);
    ...
    // database read 3
    String sql3 = "..."; // query statement
    List tmpList3 = queryBySql(sql3);
    ...
}

 

First, we create a cache module for the page (for example, a package named buffer) and, inside it, a class TestBuffer.java that holds the page's cached data. We must ensure a one-to-one correspondence between the data in TestBuffer.java and the corresponding data in the database, so we design TestBuffer.java as a singleton.

buffer
-- TestBuffer.java

 

public class TestBuffer {
    // volatile is required for safe double-checked locking
    private volatile static TestBuffer singleton = null;

    private TestBuffer() {}

    public static TestBuffer getInstance() {
        if (singleton == null) {
            synchronized (TestBuffer.class) {
                // check again inside the lock so only one instance is ever created
                if (singleton == null) {
                    singleton = new TestBuffer();
                }
            }
        }
        return singleton;
    }
}

 

Next, we analyze the database-access modules of the pages we want to accelerate and locate all of the database operation points. Analyzing TestDb.java, we find three such points: database read 1, database read 2, and database read 3. These three operations return three List objects, which are exactly the data we need to cache, so we add three corresponding fields to TestBuffer.java.

import java.util.List;

public class TestBuffer {
    private volatile static TestBuffer singleton = null;

    private TestBuffer() {}

    // the three cached result lists
    private List tmpList1 = null;
    private List tmpList2 = null;
    private List tmpList3 = null;

    public static TestBuffer getInstance() {
        if (singleton == null) {
            synchronized (TestBuffer.class) {
                if (singleton == null) {
                    singleton = new TestBuffer();
                }
            }
        }
        return singleton;
    }

    public List getTmpList1() {
        return tmpList1;
    }

    public void setTmpList1(List tmpList1) {
        this.tmpList1 = tmpList1;
    }

    public List getTmpList2() {
        return tmpList2;
    }

    public void setTmpList2(List tmpList2) {
        this.tmpList2 = tmpList2;
    }

    public List getTmpList3() {
        return tmpList3;
    }

    public void setTmpList3(List tmpList3) {
        this.tmpList3 = tmpList3;
    }
}

 

Then we rework the original database-access module, TestDb.java. The changes are as follows:

public class TestDb {
    ...
    // String sql1 = "..."; // query statement
    // String sql2 = "..."; // query statement
    // String sql3 = "..."; // query statement
    // List tmpList1 = queryBySql(sql1); // the direct queries are commented out
    // List tmpList2 = queryBySql(sql2);
    // List tmpList3 = queryBySql(sql3);

    // the three lists are now read from the cache instead of the database
    List tmpList1 = TestBuffer.getInstance().getTmpList1();
    List tmpList2 = TestBuffer.getInstance().getTmpList2();
    List tmpList3 = TestBuffer.getInstance().getTmpList3();
    ...
}
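One detail worth noting (the safeguard below is my own assumption, not part of the original rework): until the monitoring thread introduced in the next step has completed its first refresh, the cached lists may still be null. A small helper can shield the page from that window, for example by returning an empty list instead:

import java.util.Collections;
import java.util.List;

// Sketch of an optional safeguard: return the cached list if it is already
// filled, otherwise an empty list, so the page still renders before the
// monitoring thread's first refresh.
public class CacheReadHelper {

    public static List safeGetTmpList1() {
        List cached = TestBuffer.getInstance().getTmpList1();
        return (cached != null) ? cached : Collections.EMPTY_LIST;
    }
}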

 

Then we add the forward asynchronous processing module that moves data from the database into the cache: a monitoring program that runs ahead of user requests, loads the database data into the cache, and keeps the cache synchronized with the database. A simple thread will do.

import java.util.List;

public class TestThread implements Runnable {

    private static long interval = 3000; // refresh interval in milliseconds

    @Override
    public void run() {
        while (true) {
            ...
            String sql1 = "..."; // query statement
            String sql2 = "..."; // query statement
            String sql3 = "..."; // query statement

            // refresh the cache with the latest query results
            List tmpList1 = queryBySql(sql1);
            TestBuffer.getInstance().setTmpList1(tmpList1);

            List tmpList2 = queryBySql(sql2);
            TestBuffer.getInstance().setTmpList2(tmpList2);

            List tmpList3 = queryBySql(sql3);
            TestBuffer.getInstance().setTmpList3(tmpList3);
            ...

            try {
                Thread.sleep(interval);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
}
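The article does not show how this monitoring thread is started. One minimal way, assuming the application runs in a servlet container (the listener class below is my own sketch, not part of the original), is to launch it once at application startup:

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

// Minimal sketch: start the cache-refresh thread once when the web application starts.
public class CacheStartupListener implements ServletContextListener {

    @Override
    public void contextInitialized(ServletContextEvent event) {
        Thread refresher = new Thread(new TestThread(), "test-buffer-refresher");
        refresher.setDaemon(true); // do not keep the JVM alive just for this thread
        refresher.start();
    }

    @Override
    public void contextDestroyed(ServletContextEvent event) {
        // the thread is a daemon, so it stops together with the application
    }
}

The listener still has to be registered in web.xml (or annotated with @WebListener on Servlet 3.0 and later) for the container to call it.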

 

The method described above is only a reference for when you run into similar problems. Specific techniques keep changing, but the principles do not: the sensible use of caching and asynchronous processing is the core of solving this kind of problem.

 
