22 criteria for performance tuning of large-scale and high-traffic Internet sites

Source: Internet
Author: User
Tags: website performance

The original 14 rules for performance tuning of large, high-traffic websites have become the industry standard for front-end optimization, and many articles and books at home and abroad have introduced them. The 14 guidelines are one of the achievements of Yahoo's performance team in the United States over the past few years; the team has also studied and proposed many other effective website performance tuning techniques. The team is responsible for making Yahoo's products and applications faster, better, and more efficient.

1. Make fewer HTTP requests

(Minimize the number of HTTP requests)

One of the first questions is whether to combine all JavaScript and CSS into a single file or split them across multiple files.

From the perspective of reducing network requests, the former is better. But from the perspective of parallel downloads, IE and Firefox by default request only two resources from a single domain at the same time. In many cases this gives a poor user experience: all the files must be downloaded before a decent page appears. Flickr adopts a compromise: JavaScript and CSS are divided into several sub-files. This adds complexity in development, but the performance gains are huge.
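The compromise above amounts to a build step that concatenates a chosen group of source files into one downloadable resource. A minimal sketch (the grouping and any file names are assumptions, not Flickr's actual tooling):

```javascript
// Concatenate a group of JS (or CSS) sources into one file's worth of
// text, so the browser fetches one combined resource instead of many.
function combineSources(sources) {
  // the newline separator guards against files that end without one
  return sources.join("\n");
}

// e.g. write combineSources(files.map(f => fs.readFileSync(f, "utf8")))
// to a single bundle served to the browser
```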

2. Use a content delivery network

(Use CDN)

3. Add an Expires header

(Set far-future expiration times on downloaded CSS, JS, and image components)
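As a sketch, an Apache mod_expires configuration along these lines sets Expires headers for static components (the intervals shown are illustrative, not a recommendation from the original rules):

```apache
# Requires mod_expires to be enabled
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/css               "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
  ExpiresByType image/png              "access plus 1 year"
  ExpiresByType image/jpeg             "access plus 1 year"
</IfModule>
```

The longer the expiry, the more important it is to rename files when their content changes, since clients will not re-request a cached copy.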

4. Gzip Components

(Compress the downloaded component)

There is no doubt that compressing site content is a common web optimization technique, but it does not always achieve the desired effect. The reason is that the mod_gzip module consumes CPU resources on both the server and the client. In addition, the temporary files mod_gzip creates after compression are written to disk, which causes serious disk I/O problems.

httpd 2.x and later use the mod_deflate module for compression. mod_deflate is not available in httpd 1.x, but there you can create a RAM disk to indirectly improve performance.
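On httpd 2.x, a mod_deflate configuration in this spirit compresses only text assets such as HTML, CSS, and JavaScript (a sketch, not a tuned production config):

```apache
<IfModule mod_deflate.c>
  # Compress markup, styles, and scripts only;
  # images are already compressed and gain nothing
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```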

Of course, mod_gzip is not useless: it works well with pre-compressed files. When compressing, you should also pay attention to policy: there is no need to compress image files (Flickr serves a huge number of images, and compressing them yields little), only JavaScript and CSS. Newer versions of mod_gzip can automatically handle pre-compressed files when the mod_gzip_update_static option is configured. Cal also points out that this feature may cause problems in older browsers.

The other major technique is content minification. For JavaScript, you can remove comments, collapse whitespace, and use compact syntax (all of Google's scripts are compacted this way, which is why they are so hard to read). JavaScript processed like this, however, may contain many constructs that are hard to handle with naive string tricks, so Flickr uses the Dojo compressor, which builds a parse tree. The Dojo compressor has low overhead and is transparent to end users. CSS processing is comparatively simple: a simple regular expression replacing runs of whitespace with a single space achieves compression ratios of up to 50%.
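The CSS trick described above can be sketched in a few lines; this is a naive whitespace squeezer in the spirit of the text, not a full minifier (it would mangle whitespace inside quoted strings, for example):

```javascript
// Naive CSS squeezer: strip comments, then collapse runs of
// whitespace (including newlines) into single spaces.
function minifyCss(css) {
  return css
    .replace(/\/\*[\s\S]*?\*\//g, "") // drop /* ... */ comments
    .replace(/\s+/g, " ")             // collapse whitespace runs
    .trim();
}
```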

5. Put CSS components at the top of the page.

(Place CSS files at the top of the page as much as possible)

6. Put JS components as close to the bottom of the page as possible.

(Place JS files at the bottom of the page as much as possible)
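A minimal page skeleton illustrating both rules 5 and 6: stylesheets in the head so the page renders progressively, scripts just before the closing body tag so they do not block rendering (file names are placeholders):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- CSS at the top: the page can render progressively while styles load -->
  <link rel="stylesheet" href="/css/site.css">
</head>
<body>
  <p>Page content…</p>
  <!-- JS at the bottom: scripts no longer block rendering of the content above -->
  <script src="/js/site.js"></script>
</body>
</html>
```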

7. Avoid CSS expressions

(Avoid using expression() in CSS files)

8. Make JavaScript and CSS external

(Include JS and CSS files externally)

9. Reduce DNS lookups

(Reduce the number of domain name resolution requests)

10. Minify Javascript

(JavaScript code compression)

11. Avoid redirects

(Avoid redirection)

12. Remove duplicate scripts

(Avoid repeated JS files)

13. Configure ETags

(Configure ETags properly)

Flickr's developers make full use of the ETag and Last-Modified mechanisms defined in the HTTP 1.1 standard to improve caching efficiency. Notably, Cal shares an ETag tip for load-balanced servers: configure Apache to derive the ETag from the file's modification time and size. By default, Apache derives the ETag from the file's inode. Of course, this is not perfect either, because it affects If-Modified-Since handling.

However, some sites' ETags, Yahoo's for example, are generated from inodes, so the ETag for the same CSS or JS file differs across servers. Therefore, with n servers, the probability that the browser receives a 304 response is only 1/n.
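The Apache directive in question: deriving the ETag from modification time and size only (instead of the default, which includes the inode) keeps ETags identical for the same file across load-balanced servers:

```apache
# Same file content and mtime => same ETag on every server behind the balancer
FileETag MTime Size
```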

14. Make Ajax cacheable

(Cache Ajax requests)

The following are the new principles.

15. Flush the header

(First send the information in the header)

We improved the page load times by flushing the Apache output buffer after the document head was generated. This had two benefits.

First, the head contains script and link tags for scripts and stylesheets. By flushing the head, those tags are seen and parsed by the browser sooner, and in turn the browser starts downloading those components earlier.

Second, the head is flushed before the search results are actually generated. This is a win for any property doing significant backend computation, and especially for one making one or more backend web service calls.
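The pattern can be sketched in a server-side handler: write the head before starting the slow backend work, so the browser fetches the stylesheet while the body is still being computed. This is a hypothetical Node-style sketch (`renderPage`, the stylesheet path, and `computeBody` are all illustrative names), where `res` is anything with a `write()` method, such as an `http.ServerResponse`:

```javascript
// Flush the <head> early, then do the slow backend work for the body.
function renderPage(res, computeBody) {
  res.write(
    "<html><head>" +
    '<link rel="stylesheet" href="/style.css">' +
    "</head>"
  );                            // browser can start fetching /style.css now
  const body = computeBody();   // slow backend computation happens after the flush
  res.write("<body>" + body + "</body></html>");
}
```

With Apache in front, the output buffer must actually be flushed at this point (e.g. PHP's flush()) for the browser to receive the head early.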

16. Split static content across multiple hostnames

(Split large static files into requests in different domains)

If you have many (10 or more) components downloaded from a single hostname, it might be better to split them across two hostnames.
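One way to do the split is a deterministic helper that maps each asset path to one of two static hostnames, so the same path always comes from the same host and stays cacheable. A sketch (the hostnames and the hash are assumptions for illustration):

```javascript
// Hypothetical static hostnames; replace with your own.
const HOSTS = ["static1.example.com", "static2.example.com"];

// Map an asset path to a stable hostname so browsers open parallel
// connections across hosts while each path keeps a single cacheable URL.
function assetUrl(path) {
  let h = 0;
  for (const ch of path) h = (h * 31 + ch.charCodeAt(0)) >>> 0; // simple stable hash
  return "https://" + HOSTS[h % HOSTS.length] + path;
}
```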

17. Reduce the size of cookies

(Do not make the cookie content too large)

Reduce the amount of data in the cookie by storing state information on the backend and by abbreviating the names and values stored in the cookie. Set expiration dates on your cookies, and make them as short as possible.
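Following that advice, the cookie can carry only a short session id (the rest of the state lives on the backend), with an abbreviated name and a short expiry. A sketch with an illustrative helper name:

```javascript
// Keep the cookie tiny: a one-letter name, just the session id,
// and a short Max-Age; all other state stays server-side.
function sessionCookie(sessionId, maxAgeSeconds) {
  return "s=" + encodeURIComponent(sessionId) +
         "; Max-Age=" + maxAgeSeconds +
         "; Path=/; HttpOnly";
}
```

Every byte in the cookie is sent on every request to the domain, which is also why rule 18 serves static content from a cookie-free domain.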

18. Host static content on a different top-level domain

(Place static files under different top-level domain names)

19. Minify CSS

(CSS code compression)

20. Use GET for XHR

(Use GET for XMLHttpRequest where possible)

Iain Lamb did a deep study of how using POST for XMLHttpRequests is inefficient, especially in IE. His recommendation: "If the amount of data you have to send to the server is small (less than 2 K), I suggest you design your web service/client application to use GET rather than POST."
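For a small payload, that means encoding the parameters into the query string instead of a request body. A sketch (the endpoint and parameter names are placeholders):

```javascript
// Build a GET URL from a small parameter object (< 2 KB per the advice
// above); larger or sensitive payloads still belong in a POST body.
function buildGetUrl(endpoint, params) {
  const qs = new URLSearchParams(params).toString();
  return qs ? endpoint + "?" + qs : endpoint;
}

// e.g. fetch(buildGetUrl("/api/search", { q: "cdn", page: 1 }))
```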

21. Avoid iframes

(Try to avoid using iframes)

Don't set the src attribute in the HTML (set it via JS instead). Each iframe costs 20-50 ms, even if it contains nothing.
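A sketch of that advice: create the iframe without a src, then assign it from JavaScript after the main page has loaded (the element id and URL are placeholders):

```html
<iframe id="ad" title="ad slot"></iframe>
<script>
  // Assign src only after load, so the iframe doesn't delay the main page
  window.addEventListener("load", function () {
    document.getElementById("ad").src = "https://ads.example.com/slot1";
  });
</script>
```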

22. Optimize images

(Picture optimization)
