From planning through front-end development to testing and launch, the 5173 homepage front-end performance optimization project took four months to finally go live, and it met its performance targets. This was not a redesign: the original homepage design and functionality were left unchanged, and the work was purely refactoring and optimization. And although the project was called "front-end performance optimization", it was never a front-end-only effort; doing the job well required full cooperation between the front end and the back end.
Historical background
The old homepage went live around 2009. The homepage was also where departments competed for resources: everyone wanted a spot on it, and every department had its own little content block there. When new features launched, they were mostly patched on, and the only acceptance criterion was that the functionality worked; performance was an afterthought. The developers bore the pain: every change to the homepage was nerve-racking, for fear of breaking this or that, and for these historical reasons the problems only piled up.
The front-end staff were the first who couldn't stand it any longer. With the homepage endlessly patched, its performance had become embarrassingly poor. But being unable to stand it was all they could do: no substantive improvement was possible, because it touched the interests of multiple departments, and, as noted above, optimization is never a front-end-only job, so the front-end staff could only report the problem upward. This year the leadership finally couldn't stand it either: while visiting overseas, a leader compared our 818.com and 5173.com homepages. The former loaded quickly (aside: the 818 front end was also my work ^_^), the latter very slowly, and the gap was large. With the leadership's attention and backing, the 5173 homepage front-end performance optimization project was finally approved, and the developers could at last tear into things with a free hand.
Problem analysis
Before making a plan, you have to know what real problems the plan must solve, so first take a look at the old homepage. Here is the data I collected in preparation:
1. Too many requests, including 12 external CSS files and 41 external JavaScript files;
2. Total page weight far too large: many static resources were served without gzip, and the JavaScript from the dynamic sites was not even minified;
3. Heavy resource usage: scrolling the page under IE6 pushed CPU usage as high as 80%, and memory leaks were severe;
4. The ad system loaded ad images via document.write, badly blocking the page;
5. The page contained 7 iframes;
6. The data-source interfaces were a mess;
7. The page loaded far too slowly, with a white-screen flash, and the first screen took a long time to finish loading.
The data above is shocking, and it shows there was plenty of room for optimization. Once the problems are identified, the direction of the work follows. In short, whether by conventional methods or by tricks, the goal came down to two words: "fast, faster."
Specific implementation
Although at a glance the page does not seem to contain much, each functional module was time-consuming and laborious. Our boss has a classic line: "I often ask interviewees a question: if you alone had to do the front-end work for the 5173 homepage, how long would it take? If the answer is one week, either you haven't really read the page, or you're not professional." I spent a full month on my own completing the front-end development of the homepage. Here is the concrete implementation.
Refactoring the HTML & CSS
The page's design and functionality did not change, but the HTML was still completely refactored, and I took the opportunity to lay out the page with HTML5's new tags. Refactoring the CSS was equally a matter of course. The 12 external CSS files all had to go. Modules used beyond the homepage should be modularized, with shared rules placed in a common file; during development the styles can be split into multiple modules and pulled together with @import, then merged into a single file when packaged and published. This requires striking a balance: make good use of caching, avoid any single file growing too large, and keep the code modular at the same time.
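The develop-modular, ship-merged workflow can be sketched as a tiny build step that inlines `@import` statements. This is only an illustration of the idea, not our actual build tool; the file names and the in-memory `files` map are hypothetical stand-ins for reading from disk.

```javascript
// Inline `@import` statements recursively so development stays modular
// but production ships a single CSS file.
function mergeCss(entry, files) {
  return files[entry].replace(
    /@import\s+(?:url\()?["']?([^"')\s;]+)["']?\)?\s*;/g,
    // Replace each @import with the (recursively merged) file contents;
    // leave the rule untouched if the file is unknown.
    (match, path) => (path in files ? mergeCss(path, files) : match)
  );
}

// Hypothetical module layout for the homepage styles:
const files = {
  "home.css": '@import "reset.css";\n@import "layout.css";\n.home { color: #333; }',
  "reset.css": "* { margin: 0; padding: 0; }",
  "layout.css": ".wrap { width: 960px; }",
};

console.log(mergeCss("home.css", files)); // one merged stylesheet, no @import left
```

A real packager would also minify the merged output; the point here is only that module boundaries exist at development time and disappear at publish time.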
The old homepage was full of over-long CSS selectors; chains of six or seven selectors were common. Over-long selectors not only hurt performance but, because of CSS specificity, also make later maintenance painful. Selectors should be built from classes, combined with tag selectors where needed; chains of at most three selectors cover the vast majority of cases. In addition, ID selectors and !important were banned in our CSS. Developing good CSS-writing habits is very important.
Refactoring the JavaScript
The JavaScript refactoring was the most urgent. There were 41 external JavaScript files; 15 of them were ad scripts, whose optimization I will cover later, but that still left 26. These bloated JavaScript files were the cause of the page's loading jams and the moths gnawing away at system resources, and they were one of the hardest parts of the whole project.
The business logic of the four-level linked search was untangled, and its interaction was reworked to improve the user experience. This module involves a lot of Ajax; the largest JSON packet weighed in at a startling 94.4KB, which makes sensible use of in-page caching ($.fn.data) essential. That largest packet is loaded after DOM ready, and the first screen's HTML is assembled from it and cached. When the user picks a game from the alphabetical index, the matching data is looked up in the already-loaded packet, assembled into HTML, and that HTML is cached under the index. When the user then selects region, server, and transaction type, the corresponding JSON is fetched from the server and assembled into HTML, which is cached as well; at this stage only the most recent selection's result is kept.
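The fetch-once, assemble-once, cache-by-index strategy can be sketched in plain JavaScript. The real page used jQuery's `$.fn.data` for the cache; here a `Map` stands in for it, `fetchJson` is a hypothetical stand-in for the Ajax call that loads the 94.4KB packet (shown synchronously for brevity), and the data shape is invented for illustration.

```javascript
const htmlCache = new Map(); // letter index -> assembled HTML fragment
let gamePacket = null;       // the big JSON packet, fetched at most once

function getGamesHtml(letter, fetchJson) {
  // Cache hit: no lookup, no assembly, no request.
  if (htmlCache.has(letter)) return htmlCache.get(letter);
  // Load the large packet once (in reality: after DOM ready, via Ajax).
  if (!gamePacket) gamePacket = fetchJson();
  // Assemble HTML for this alphabetical index and cache it.
  const games = gamePacket.filter((g) => g.index === letter);
  const html = "<ul>" + games.map((g) => "<li>" + g.name + "</li>").join("") + "</ul>";
  htmlCache.set(letter, html);
  return html;
}
```

Selecting the same letter twice costs one request and one assembly in total; only the later region/server/type steps go back to the server.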
The Convenience Center is another homepage module with very complex business logic, involving a lot of Ajax and form handling. The table in each tab generates its HTML from requested JSON data. Originally every tab switch triggered a fresh request and regenerated the HTML: switch, request, regenerate, every single time, which is just silly. The same data and the same structure should be requested once and generated once; repeating the work is a blatant waste of resources. The module's JavaScript was also originally served from the dynamic site, uncached and uncompressed, so every page load stalled here for a moment. The server-side data interfaces were equally messy; the developers lacked any concept of a standardized data interface. There were more problems here than I can list. In the end, it again came down to the painful work of re-sorting the business logic and refactoring the code.
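The "request once, generate once" fix for the tab tables amounts to memoizing each tab's rendered HTML. A minimal sketch, in which `loadTabData` and `renderHtml` are hypothetical stand-ins for the module's Ajax call and template code (shown synchronously for brevity):

```javascript
// Returns a show(tabId) function that hits the network and the template
// only on a tab's first activation; later switches are pure cache reads.
function makeTabRenderer(loadTabData, renderHtml) {
  const rendered = new Map(); // tabId -> cached HTML
  return function show(tabId) {
    if (!rendered.has(tabId)) {
      const data = loadTabData(tabId);      // request once per tab
      rendered.set(tabId, renderHtml(data)); // generate once per tab
    }
    return rendered.get(tabId);
  };
}
```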
Deferred loading: speeding up the first screen
When a user opens a long web page, the loading of the first-screen content is the most direct speed experience, so finishing the first screen as quickly as possible is the most important factor in whether the user judges the page "fast". On the 5173 homepage the images are mostly concentrated below the fold, so deferring all of those images speeds up the first screen as much as possible. The common image lazy-loading technique should be familiar to everyone, so I won't repeat it here. The tab panels also contain many images, and those are likewise loaded only when their tab menu item is triggered. Give every image a fixed size in the HTML and that's the end of it, right? Not quite!
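For reference, the common lazy-loading technique mentioned above boils down to a visibility test plus a `data-src` swap. This is a generic sketch, not our production code: the visibility check is written as a pure function, and the browser wiring (attribute names, threshold value) is an assumption shown in comments.

```javascript
// Decide whether an image overlaps the viewport, optionally expanded by a
// `threshold` of pixels so images start loading slightly before they appear.
function shouldLoad(imgTop, imgHeight, scrollTop, viewportHeight, threshold = 0) {
  const viewTop = scrollTop - threshold;
  const viewBottom = scrollTop + viewportHeight + threshold;
  return imgTop < viewBottom && imgTop + imgHeight > viewTop;
}

// In the browser (sketch): real src is parked in data-src, swapped in on scroll.
// window.addEventListener("scroll", () => {
//   for (const img of document.querySelectorAll("img[data-src]")) {
//     if (shouldLoad(img.offsetTop, img.offsetHeight,
//                    window.pageYOffset, window.innerHeight, 200)) {
//       img.src = img.dataset.src;       // trigger the real download
//       img.removeAttribute("data-src"); // never check this one again
//     }
//   }
// });
```

Giving each `img` a fixed width and height in the HTML, as noted above, keeps the layout from jumping when the real image arrives.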
Not all of the images are business-configured images; some come from a third-party ad system (including the large carousel images on the first screen). The URL of such an ad "image" is actually a JavaScript link whose code loads the ad image via document.write. Some tabs also embed partner-site content in the page via an iframe. The ad images and the iframes were the chief culprits blocking page load.
The initial idea was to build a new ad system that could load ads a different way, but the development cost was too high. In the end we settled on using a textarea to defer the ads and iframes; this method, shared by Yuber, is extremely useful. The textarea is a wonderful thing: ordinary HTML, CSS, or JavaScript code can all be thrown inside it to achieve deferred loading. The ad-image optimization is more involved, and I have covered it in detail in another article. With the textarea trick, much of the content can be lazy-loaded just like the images, and the iframes inside tabs can likewise be loaded only when the tab menu is triggered.
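The core of the textarea trick: markup parked inside a hidden `<textarea>` is inert text as far as the browser is concerned, so iframes and ad scripts inside it cost nothing at page load; when the content is actually needed, its value is read back out and injected. A minimal sketch, with hypothetical class names, and with the browser wiring in comments (note that markup injected via `innerHTML` starts iframes and images loading, but ad scripts relying on `document.write` need the extra handling covered in the other article):

```javascript
// A textarea's value is plain text, so the parked markup comes back
// untouched; trim the whitespace the template may have added around it.
function extractDeferred(textareaValue) {
  return textareaValue.trim();
}

// In the browser (sketch), on tab activation or on scroll-into-view:
// const ta = panel.querySelector("textarea.js-defer"); // hidden via CSS
// panel.innerHTML = extractDeferred(ta.value);         // content loads only now
```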
It is these deferred-loading tricks, applied to every kind of content, that push the first-screen loading speed to the limit. One side effect does need mentioning, though: for relatively important content, the impact of deferred loading on SEO must be taken into account.
Server-side optimization
That basically covers everything the front end can do; now for the server-side work. Originally each site's servers supplied their own data sources to the front end, so the front end had to coordinate with developers in every department, and the data sources they provided were also fairly slow. After discussion we decided to aggregate the data sources onto an intermediary server, from which the front end fetches all its data uniformly; the server-to-server communication keeps a certain cache lifetime. This solved both the slowness and the inconsistency of the data sources.
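The intermediary-server idea reduces to a keyed cache with a TTL in front of the slow departmental sources. A sketch of that mechanism, in which `fetchers`, the keys, and the injectable clock are hypothetical (shown synchronously for brevity; the real thing sits on a server between departments and the front end):

```javascript
// Returns a get(key) that serves cached data while it is fresh and falls
// back to the department's slow source only when the TTL has expired.
function makeAggregator(fetchers, ttlMs, now = Date.now) {
  const cache = new Map(); // key -> { value, expires }
  return function get(key) {
    const hit = cache.get(key);
    if (hit && hit.expires > now()) return hit.value; // fresh: skip slow source
    const value = fetchers[key]();                     // refresh from source
    cache.set(key, { value, expires: now() + ttlMs });
    return value;
  };
}
```

The front end then talks to one endpoint with one interface style, which is exactly the "unified data source" part of the fix.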
As for the excessive total page weight, the code refactoring did trim a good deal of it, and beyond that every static resource was served with gzip. That single change alone brought a very visible performance improvement.
Making sensible use of the browser-side cache is just as important. Apart from login information and the more time-sensitive cookie-carrying requests, every request that could take one was given a Cache-Control max-age expiry. On adding browser-side caching there is a more detailed article, "Cache Them If You Can". Caching does complicate updates, so there must be a way to bust the cache: append a timestamp to static resource requests.
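The two halves of that strategy fit together like this: a long max-age makes repeat visits cheap, and a version or timestamp on the URL forces a fresh fetch after a deploy. A sketch of the URL-busting half, with the header value, query-parameter name, and version string all being illustrative assumptions:

```javascript
// Append a cache-busting version to a static resource URL, handling URLs
// that already carry a query string.
function bust(url, version) {
  const sep = url.includes("?") ? "&" : "?";
  return url + sep + "v=" + version;
}

// Server side (conceptually):  Cache-Control: max-age=2592000   (30 days)
// Page side (conceptually):    <script src="/js/home.js?v=20111024"></script>
```

A build number is usually a better `version` than a raw timestamp, since a timestamp that changes on every request would defeat the cache entirely.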
On the server side we also boldly adopted Varnish as a caching accelerator, something that should still be rare among large domestic websites.
Optimization results
Now it's time to look at the results of the optimization, starting with some data comparisons.
Request counts before and after optimization:
The large reduction in requests eases the pressure on the servers; quite a few servers could be decommissioned.
Static-resource file sizes before and after optimization (excluding other payloads such as Ajax data):
Judging from the file-size comparison, the optimization saves 494KB of download per view. At 1,000,000 page views per day (an estimate; the real figure is much higher but inconvenient to disclose), that works out to roughly 470GB of traffic saved daily.
Load times before and after optimization, tested against Taobao and Paipai at the same time under the same network conditions. The test software was WebWatch on IE9; the cache was cleared before each run, and the figures are averages over multiple runs:
On the load-speed analysis: Taobao and Paipai carry more images on the first screen, so their first screens come in quickly, but their total load times are much longer; of course, their download volumes are also much larger. The 5173 first screen is heavier on DOM and very light on downloads, so its total time and first-screen time end up quite close. "Total download" here means the page's full download volume at load completion; since deferred loading is used throughout, more images load as you scroll down, and those times are not counted.
So how, in the end, should page load speed be measured? For this optimization I did not use scoring tools like YSlow or PageSpeed; instead, the optimization target was the actual loading speed, and first-screen load time describes that best. If a site shows nothing but a blank screen for ages, most people will feel it is slow. That is the real experience, and it is something the scoring tools do not reflect.
Article source: Rainy Night with Knives' Blog