Ajax performance optimization for no-refresh paging


Ajax no-refresh paging is by now a familiar technique. Roughly, a JavaScript method on the front-end page requests a server-side paging data interface via Ajax, receives the data, builds the HTML structure on the page, and presents it to the user, similar to the following:

<script type="text/javascript">
function getPage(pageIndex) {
    ajax({
        url: "remoteinterface.cgi",
        method: "get",
        data: { pageIndex: pageIndex },
        callback: callback
    });
}
function callback(dataList) {
    // TODO: build the HTML structure from the returned dataList and present it to the user.
}
</script>

Here remoteinterface.cgi is a server-side interface. For reasons of space, the example code is not complete; it is only meant to make the idea clear.

On the UI side there can be pagination controls in many styles, of the kind we are all familiar with.

A user's click on such a control simply triggers the getPage(pageIndex) method, but that getPage method may not be as simple as the one above.

Following code fragment 1, every click on the pager issues a request to remoteinterface.cgi. Leaving aside the possibility that the data may have been updated, every trigger of getPage(1), getPage(2), getPage(3) and so on after the first one repeats a remote interface request and its round-trip network traffic unnecessarily. The data could instead be cached in some form on the page the first time it is requested. If the user later pages back to data already seen, getPage should first check whether the local cache holds that page's data; if so, it simply re-displays it to the user instead of calling the remote interface. With that in mind, code fragment 1 can be modified as follows:

<script type="text/javascript">
var pageDataList = {};
function getPage(pageIndex) {
    if (pageDataList[pageIndex]) { // the local data list already contains the requested page
        showPage(pageDataList[pageIndex]); // show the cached data directly
    } else {
        ajax({
            url: "remoteinterface.cgi",
            method: "get",
            data: { pageIndex: pageIndex },
            callback: callback
        });
    }
}
function callback(pageIndex, dataList) {
    pageDataList[pageIndex] = dataList; // cache the data
    showPage(dataList); // present the data
}
function showPage(dataList) {
    // TODO: build the HTML structure from the returned dataList and present it to the user.
}
</script>

This not only saves the round-trip time of network requests; more importantly, it saves valuable network traffic and reduces the load on the interface server. In a low-bandwidth environment, or when the interface server is already under heavy load, this simple improvement can produce a clearly visible effect. The first of Yahoo's well-known 34 performance rules is to minimize the number of HTTP requests, and Ajax asynchronous requests certainly count as HTTP requests. A web app with little traffic may not feel the need, but imagine a page with 10 million visits a day, where the average user views 5 pages, one of which is a repeat view. Under code fragment 1 such a page triggers on average 50 million data requests a day; under code fragment 2 it saves at least 10 million of them. If each request returns 20 KB of data, that saves 10,000,000 × 20 KB = 200,000,000 KB, about 190 GB of network traffic per day. The resources saved this way are quite considerable.

Going a step further, the data caching method in code fragment 2 is worth discussing. We assumed above that the timeliness of the paging data could be ignored, but in real applications timeliness is an unavoidable issue. Caching inevitably reduces freshness, so a real caching scheme must rest on an analysis of, and trade-off against, the application's timeliness requirements.

For content that does not place special emphasis on timeliness, caching on the page should still be acceptable: a user will not stay on one page forever, and any navigation reloads the page and fetches updated data. Moreover, users who habitually refresh the page can do so whenever they particularly want to check the list for updates. If you are after something closer to perfect, consider setting a time window, say 5 minutes: while the user stays on the current page, reads within 5 minutes come from the cache, and after 5 minutes the server is asked for the data again.
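The 5-minute window above can be sketched as a small time-bounded cache. This is a minimal illustration, not code from the article; the names cachePut, cacheGet and CACHE_MAX_AGE_MS are assumptions, and timestamps are passed in explicitly so the expiry logic is easy to follow.

```javascript
// Time-bounded page cache: each entry stores the data plus the time it was
// fetched; entries older than CACHE_MAX_AGE_MS are treated as cache misses.
var CACHE_MAX_AGE_MS = 5 * 60 * 1000; // the 5-minute window from the text

var pageCache = {};

function cachePut(pageIndex, dataList, now) {
  pageCache[pageIndex] = { data: dataList, fetchedAt: now };
}

function cacheGet(pageIndex, now) {
  var entry = pageCache[pageIndex];
  if (!entry) return null; // never fetched: caller should hit the server
  if (now - entry.fetchedAt > CACHE_MAX_AGE_MS) {
    delete pageCache[pageIndex]; // stale: force a fresh server request
    return null;
  }
  return entry.data;
}
```

In a real page, getPage would call cacheGet first (with `Date.now()` as `now`) and only fall through to the Ajax request on a null result.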

In some cases, if the update frequency of the data can be predicted, for instance when there may be an update only every few days, you can even consider using local storage and triggering a request for server data only at certain times, making the savings in requests and traffic even more thorough. Ultimately, which caching method applies comes down to the product's timeliness requirements, but the principle is to save requests and traffic wherever they can be saved, and especially so for pages with very high traffic.
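A rough sketch of that local-storage idea, under stated assumptions: in a browser the storage argument would be `window.localStorage` (which stores strings), but it is passed in here so the logic stands alone; savePage, loadPage and the "page:" key prefix are illustrative names, not from the article.

```javascript
// Persist page data across page loads when updates are known to be infrequent
// (e.g. once every few days). Entries carry a savedAt timestamp so old data
// eventually forces a real server request.
function savePage(storage, pageIndex, dataList, now) {
  storage["page:" + pageIndex] = JSON.stringify({ data: dataList, savedAt: now });
}

function loadPage(storage, pageIndex, maxAgeMs, now) {
  var raw = storage["page:" + pageIndex];
  if (!raw) return null; // nothing stored: request from the server
  var entry = JSON.parse(raw);
  if (now - entry.savedAt > maxAgeMs) return null; // too old: request again
  return entry.data;
}
```

With a multi-day update cycle, maxAgeMs could be set to a day or more, so repeat visitors generate almost no list requests at all.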

For data that demands high timeliness, does caching simply not apply? Not at all, but the whole approach has to change again. What "changes" in a list is usually that items have been added, removed, or modified, while most of the data remains unchanged, so caching the data for a period of time, as set up earlier, still holds in most cases.

If the data must be updated in real time, the obvious idea is a timer: call getPage(pageIndex) every 20 seconds, say, and redraw the list. But recall the earlier assumption of 10 million page views a day and you will see how frightening that is; at that traffic and polling frequency the pressure on the server would be mountainous. For how to handle this situation, look at how Gmail, the 163 mailbox, the Sina mailbox and similar services handle their mailing-list pages. They match our earlier assumptions almost exactly: very high daily traffic, data that updates in real time, and so on. Analysis with a network capture tool shows that they do not make a server request when a user repeatedly requests the same page of data. To ensure that a new message is reported to the user promptly and the mailing list is updated, they do use a timed, repeated asynchronous request, but that request only performs a status query rather than refreshing the list. A request for updated data is made only when the status query reports an update, or the status-query interface returns the updated data directly when it detects a change. In fact the 163 mailbox polls status at a fairly long interval of about two minutes, and the Sina mailbox at an even longer interval of about 5 minutes; clearly both are trying to reduce the number of requests. This kind of handling, however, is probably not something the front end can do unilaterally; the implementation has to be worked out together with the back-end interface.
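The mailbox-style polling described above can be sketched as follows. This is a schematic, not any mail client's actual code: checkStatus stands in for a lightweight "anything new?" Ajax call and refreshList for the heavy list request, and both are assumptions.

```javascript
// One polling step: ask the server only whether anything changed, and issue
// the expensive list request only when the answer is yes.
function pollTick(checkStatus, refreshList) {
  checkStatus(function (hasUpdate) {
    if (hasUpdate) {
      refreshList(); // heavy request only when something actually changed
    }
    // otherwise: do nothing, no list traffic at all
  });
}

// Repeat the status query on a long interval (the text observes roughly
// 2 minutes for the 163 mailbox and 5 minutes for the Sina mailbox).
function startStatusPolling(checkStatus, refreshList, intervalMs) {
  return setInterval(function () {
    pollTick(checkStatus, refreshList);
  }, intervalMs);
}
```

Separating pollTick from the timer keeps the decision logic testable and makes it easy to tune the interval, which is exactly where those services trade freshness for request count.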

Now let's look back at the data caching in code fragment 2, no longer at the requests and traffic it saves but at what else in the front-end implementation is worth digging into. The fragment stores the raw data, and on every repeat view showPage(dataList) must rebuild the HTML structure from that data and present it to the user again, even though the structure was already built once before. Could the structure itself be saved when it is first created? That would avoid repeated JS computation, which is especially worthwhile when the structure is complex. Going further, that structure has already been created on the page; destroying it on each page turn and creating a new one also consumes resources. Could everything be created once, never destroyed, and merely hidden via CSS, so that turning pages only toggles which of the already-created structures is shown and which are hidden?
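A minimal sketch of that show/hide idea, caching the rendered structure rather than the raw data. The names pageNodes, showCachedPage and createPageNode are illustrative; in a browser createPageNode would build and append a real DOM container, while here it is just a factory passed in by the caller.

```javascript
// Map from pageIndex to an already-built page container. Each page's
// structure is created exactly once; switching pages only toggles
// display styles instead of destroying and rebuilding DOM.
var pageNodes = {};

function showCachedPage(pageIndex, createPageNode) {
  if (!pageNodes[pageIndex]) {
    pageNodes[pageIndex] = createPageNode(pageIndex); // build once, keep on the page
  }
  for (var key in pageNodes) {
    // hide every cached page except the requested one
    pageNodes[key].style.display = (key == pageIndex) ? "" : "none";
  }
  return pageNodes[pageIndex];
}
```

Note the loose `==` comparison: `for...in` yields string keys while pageIndex may be a number. The trade-off is memory: every visited page's structure stays in the document, which is why the article recommends this mainly when the structure is complex enough that rebuilding it is the greater cost.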

Finally, the approaches discussed here do not necessarily apply to every scenario, but trying one or two of them at the right moment may prove somewhat enlightening. And with a little divergent thinking, they are not limited to no-refresh paging either. You are welcome to discuss them here.
