Yahoo's 34 Rules for Website Performance Optimization

1. Minimize the number of HTTP requests

Of the time an end user spends waiting for a page, 80% is spent downloading components: images, style sheets, scripts, Flash, and so on. Reducing the number of components in the page, and therefore the number of HTTP requests, is the key step in making your web pages faster.

One way to reduce page components is to simplify the page's design. But is there a way to keep pages rich in content while still improving response time? Here are several techniques that reduce the number of HTTP requests without sacrificing rich content.

1. Combined files: combining files reduces HTTP requests by putting all scripts into a single script file, and similarly all CSS into a single style sheet. Combining files is more cumbersome when scripts and style sheets vary from page to page, but even so, this step is important for improving page performance.

2. CSS sprites: CSS sprites are an effective way to reduce image requests. Combine all background images into a single image file, then use the CSS background-image and background-position properties to display the desired part of that image.

3. Image maps: an image map combines multiple images into a single image. The overall size stays about the same, but the number of HTTP requests drops. Image maps only work when the images are contiguous on the page, as in a navigation bar. Defining the coordinates can be tedious and error-prone, and image-map navigation hurts accessibility, so this method is not recommended.

4. Inline images: inline images use the data: URL scheme to embed image data in the page itself, which increases the size of the HTML document. Embedding inline images in a (cacheable) style sheet instead reduces HTTP requests without bloating the pages themselves, but inline images were not yet supported across all mainstream browsers at the time of writing.
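As a minimal sketch of the CSS sprites technique from item 2, assuming a hypothetical 16x32 sprite file icons.png that stacks two 16x16 icons vertically:

```css
/* icons.png is a hypothetical sprite: two 16x16 icons stacked vertically */
.icon        { background-image: url(icons.png); width: 16px; height: 16px; }
.icon-home   { background-position: 0 0; }     /* shows the top icon    */
.icon-search { background-position: 0 -16px; } /* shows the bottom icon */
```

Both icons now cost a single HTTP request instead of two.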

Reducing the number of HTTP requests in your page is the place to start when optimizing site performance, and it is the most important guideline for improving wait times for first-time visitors. As Tenni Theurer describes in her blog post "Browser Cache Usage – Exposed!", HTTP requests account for 40% to 60% of response time for visitors with an empty cache. Reduce HTTP requests so that people visiting your site for the first time get a faster experience!

2. Reduce the number of DNS lookups

The Domain Name System (DNS) maps domain names to IP addresses, just as a phone book maps people's names to their phone numbers. When you type www.52maomao.info into the browser's address bar, a DNS resolver returns the IP address for that domain name. DNS resolution takes time: it typically takes 20 to 120 milliseconds to return the IP address for a given host name, and the browser can download nothing from that host until the lookup completes.

Caching DNS lookups improves performance. The caching can occur on a special caching server, typically run by the user's ISP or local network, but there is also a cache on the user's own computer. DNS information is kept in the operating system's DNS cache (the DNS Client service on Microsoft Windows), and most browsers additionally keep their own caches, independent of the operating system's. As long as the browser holds a DNS record in its own cache, it does not need to ask the operating system for it on each request.

Internet Explorer caches DNS lookups for 30 minutes by default, controlled by the DnsCacheTimeout registry setting. Firefox caches DNS lookups for 1 minute, controlled by the network.dnsCacheExpiration configuration setting (Fasterfox changes this to 1 hour).

When the client's DNS cache is empty (in both the browser and the operating system), the number of DNS lookups equals the number of unique host names in the page, counting the host names used in the page's URL, images, script files, style sheets, Flash objects, and so on. Reducing the number of unique host names reduces DNS lookups. However, it also reduces the amount of parallel downloading in the page: fewer DNS lookups save response time, but less parallel downloading can increase it. My guideline is to split page components across at least two but no more than four host names, as a trade-off between reducing DNS lookups and preserving a high degree of parallel downloads.
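To illustrate the trade-off, here is a small Node.js sketch (the URLs are hypothetical placeholders) that counts the unique host names a page references, which approximates the worst-case number of DNS lookups on an empty cache:

```javascript
// Hypothetical component URLs for a page split across three hosts.
const resources = [
  "http://www.example.com/index.html",
  "http://statics1.example.com/logo.png",
  "http://statics1.example.com/site.css",
  "http://statics2.example.com/app.js",
];

// Each distinct host name can cost one DNS lookup on an empty cache,
// but also contributes its own pool of parallel downloads.
function uniqueHostnames(urls) {
  return new Set(urls.map((u) => new URL(u).hostname));
}

console.log(uniqueHostnames(resources).size); // 3 hosts: within the 2-4 guideline
```

Three hosts falls inside the two-to-four range the text recommends.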

3. Avoid redirects

Redirects are performed using the 301 and 302 status codes. Here is an example of the HTTP headers in a 301 response:

HTTP/1.1 301 Moved Permanently
Location: http://52maomao.info/demo
Content-Type: text/html

The browser automatically takes the user to the URL given in the Location header. All the information needed for the redirect is in the headers; the response body is typically empty. Despite their names, neither a 301 nor a 302 response is cached in practice unless additional headers, such as Expires or Cache-Control, indicate that it should be.

The <meta> refresh tag and JavaScript can also send users to another URL, but if you must perform a redirect, the preferred technique is the standard 3xx HTTP status codes, chiefly to ensure that the Back button works correctly.

But remember that redirects hurt the user experience. Inserting a redirect between the user and the HTML document delays everything on the page, since no component (image, Flash, etc.) can be downloaded until the HTML document itself has arrived.

One common, wasteful redirect that web developers often overlook happens when a trailing slash (/) is missing from a URL that should have one. For example, visiting http://astrology.yahoo.com/astrology actually returns a 301 redirect pointing to http://astrology.yahoo.com/astrology/ (note the trailing slash). On Apache this can be avoided using Alias, mod_rewrite, or the DirectorySlash directive.
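The trailing-slash fix can also be applied when generating links, so the redirect never happens. A sketch in JavaScript, assuming (as a heuristic, not a rule) that a URL whose last path segment contains no dot is a directory:

```javascript
// Append the trailing slash to directory-style URLs up front so the
// server never has to answer with a 301 redirect.
function addTrailingSlash(url) {
  const u = new URL(url);
  const lastSegment = u.pathname.split("/").pop();
  // Assumption: path segments containing a dot are files, not directories.
  if (!u.pathname.endsWith("/") && !lastSegment.includes(".")) {
    u.pathname += "/";
  }
  return u.toString();
}

console.log(addTrailingSlash("http://astrology.yahoo.com/astrology"));
// http://astrology.yahoo.com/astrology/
```

File URLs such as logo.png pass through unchanged.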

Connecting an old website to a new one is another common use of redirects. Others include connecting different parts of a website and directing users based on certain criteria (browser type, account type, and so on). Redirecting between two websites is simple and requires little code. Although this reduces complexity for developers, it degrades the user experience.

An alternative, when both sites are hosted on the same server, is to use Alias and mod_rewrite instead. If a redirect is used because the domain names differ, an option is to create a CNAME (a DNS record that maps one domain name to another) combined with Alias or mod_rewrite.

4. Make Ajax cacheable

One of Ajax's frequently cited benefits is the immediacy of feedback it gives users, because information is requested asynchronously from the backend server. However, using Ajax is no guarantee that users will not spend time waiting for those asynchronous JavaScript and XML responses.

In many applications, whether the user has to wait for a response depends on how Ajax is used. For example, in a web-based email client the user must wait for the results of an Ajax request that finds all the messages matching their search criteria. It is important to remember that "asynchronous" does not mean "instantaneous".

To improve performance, it is important to optimize these Ajax responses. The most important way to speed up Ajax is to make the responses cacheable, as discussed in "Add an Expires or a Cache-Control Header". Several of the other rules also apply to Ajax:

1. Gzip components;

2. Reduce DNS lookups;

3. Minify JavaScript;

4. Avoid redirects;

5. Configure ETags.

Let's look at an example: a Web 2.0 email client uses Ajax to download the user's address book. If the user has not modified the address book since the last time the email application was used, and the Ajax response was made cacheable with an Expires or Cache-Control header, the previous address book can be read straight from the cache. The browser must be told when to use the cached address book versus sending a new request. This can be done by adding a timestamp of the last edit to the address book's Ajax URL, for example &t=11900241612. If the address book has not been edited since the last download, the timestamp is unchanged, the URL matches, and the address book is loaded from the browser's cache, eliminating an extra HTTP round trip. If the user has modified the address book, the new timestamp ensures the URL does not match the cached response, and the browser requests the updated address book.
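The timestamp scheme described above can be sketched as follows (the URL and parameter names are illustrative, not from an actual mail client):

```javascript
// Build the address-book Ajax URL from the book's last-edit timestamp.
// An unchanged timestamp yields an identical URL, so the cached
// (Expires/Cache-Control tagged) response is reused with no new request.
function addressBookUrl(lastEditTime) {
  return "/ajax/addressbook?action=read&t=" + lastEditTime;
}

const before = addressBookUrl(11900241612);
const unchanged = addressBookUrl(11900241612);
const afterEdit = addressBookUrl(11900399999); // hypothetical newer edit time

console.log(before === unchanged); // true  -> browser cache hit
console.log(before === afterEdit); // false -> fresh request for new data
```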

Even though your Ajax responses are created dynamically, and even if they are relevant to only a single user, they should still be cacheable. Doing so makes your Web 2.0 applications faster.

5. Delay Loading Content

Take a close look at your web pages and ask yourself: what content is absolutely required to render the page initially? What content and components can wait until afterwards?

JavaScript is an ideal candidate for splitting into two parts, before and after the onload event. For example, JavaScript code and libraries that implement drag-and-drop and animations can wait, because dragging elements on the page happens only after the initial rendering. Other candidates for delayed loading include hidden content (content that appears only after a user action) and images below the fold.
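The split can be sketched as a tiny scheduler; in a real page you would wire fireOnload to the window.onload event (the function names here are made up for illustration):

```javascript
// Queue of work that is not needed for the initial render.
const deferredTasks = [];
function scheduleAfterLoad(task) { deferredTasks.push(task); }
function fireOnload() { deferredTasks.forEach((task) => task()); }

// Drag-and-drop init can wait: nothing is draggable before first render.
let dragDropReady = false;
scheduleAfterLoad(() => { dragDropReady = true; });

console.log(dragDropReady); // false: initial render happens without it
fireOnload();               // in a browser: window.onload = fireOnload;
console.log(dragDropReady); // true: enhancement loaded after the render
```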

Tools can save you the work: the YUI Image Loader lets you delay loading images below the fold, and the YUI Get utility is an easy way to fetch JS and CSS on the fly. As an example, open the Firebug Net panel while looking at Yahoo!'s home page.

It is good when performance goals align with other web development best practices. In this case, the idea of progressive enhancement tells us that JavaScript, when supported, can improve the user experience, but the page must still work without it. After you have confirmed that the page works plain, enhance it with delayed scripts for fancier effects such as drag-and-drop and animations.

6. Pre-loading

Preloading may look like the opposite of delayed loading, but it actually serves a different goal. With preloading you take advantage of the time the browser is idle to request components (such as images, style sheets, and scripts) that will be needed in the future. That way, when the user visits the next page, most of its components are already in the cache and the page loads much faster.

Several preload methods are provided below:

Unconditional preload: as soon as the onload event fires, go ahead and fetch additional components. Take google.com as an example: look at how its sprite image is requested in the onload handler. That sprite is not needed on the google.com home page, but it is needed on the search results page.

Conditional preload: based on a user action, make an educated guess about where the user is headed next and preload accordingly. On search.yahoo.com you can see how additional components are requested after you start typing in the search box.

Anticipated preload: preload in advance of launching a redesign. It often happens that after a redesign users complain, "The new site is cool, but it's slower than before." Part of the problem is that users visited the old site with a full cache but arrive at the new one with an empty one. You can mitigate this by preloading some components before the redesign even launches: use the old site's idle browser time to preload the images and scripts that the new site will use.

7. Reduce the number of DOM elements

A complex page means more bytes to download, and it also means slower DOM access in JavaScript. It makes a difference whether you loop through 500 or 5,000 DOM elements, for example when you want to add an event handler.

A high number of DOM elements can be a sign that there is something in the page's markup that should be streamlined without removing content. Are you using tables for layout? Are you adding more <div> elements only for layout purposes? There may be a better, more semantically appropriate tag for the job.

The YUI CSS utilities can help greatly with your layout: grids.css helps with the overall layout, while fonts.css and reset.css strip away the browser's default formatting. They offer a chance to rethink your markup, for example to use <div> only when it is semantically meaningful, not merely because it renders a new line.

The number of DOM elements is easy to check: just type this into Firebug's console:

document.getElementsByTagName('*').length

And how many DOM elements is too many? Compare with similar pages that are well marked up. The Yahoo! home page, for example, is a fairly content-dense page, yet it uses only about 700 elements (HTML tags).

8. Split page components across domains

Splitting components across domains lets you maximize parallel downloads. Because of the cost of DNS lookups, first make sure the number of domains you use is between two and four. For example, you can host the HTML document and dynamic content on www.52maomao.info and split the page's static components (images, scripts, CSS) between statics1.52maomao.info and statics.52maomao.info.

You can find more information in "Maximizing Parallel Downloads in the Carpool Lane" by Tenni Theurer and Patty Chi.

9. Minimize the number of iframes

The iframe element allows an HTML document to be inserted inside the parent document. It is important to understand how iframes work in order to use them effectively.

<iframe> advantages:

1. Helps with slow-loading third-party content such as badges and ads;

2. Provides a security sandbox;

3. Allows scripts to be downloaded in parallel.

<iframe> disadvantages:

1. Costly even if blank; loading takes time;

2. Blocks the parent page's onload event;

3. Non-semantic.

10. Avoid 404 errors

HTTP requests are expensive, so making an HTTP request only to receive a useless response (such as 404 Not Found) is completely unnecessary; it degrades the user experience without any benefit. Some sites present helpful 404 pages ("Are you looking for ...?"), which improves the user experience but still wastes server resources (database queries and so on). Particularly bad is a link to an external JavaScript file that is wrong and returns a 404: first, such a download blocks parallel downloads; second, the browser may try to parse the body of the 404 response as JavaScript, looking for something usable to execute.

11. Use a content delivery network

The user's proximity to your web server has an impact on response times. Deploying your content across multiple, geographically dispersed servers makes your pages load faster from the user's perspective. But where should you start?

As a first step toward geographically dispersed content, do not attempt to redesign your web application to work in a distributed architecture. Depending on the application, such a redesign could include daunting tasks like synchronizing session state and replicating database transactions across server locations. Attempts to shorten the distance between users and your content could be delayed by, or never get past, this application-architecture step.

Remember that 80% to 90% of end-user response time is spent downloading the page's components: images, style sheets, scripts, Flash, and so on. This is the Performance Golden Rule. Rather than starting with the difficult task of redesigning your application architecture, it is better to disperse the static content first. This not only yields a bigger reduction in response times, it is also easier to do thanks to content delivery networks.

A content delivery network (CDN) is a collection of web servers distributed across multiple geographic locations to deliver content to users more efficiently. The server chosen for delivering content to a given user is typically based on a measure of network proximity, for example the server with the fewest network hops or the quickest response time.

Some large Internet companies own their own CDN, but using a CDN service provider such as Akamai Technologies, Mirror Image Internet, or Limelight Networks can be expensive. Startups and personal websites may not have the budget for a CDN, but as the target audience grows and becomes more global, a CDN becomes necessary to achieve fast response times. At Yahoo!, moving static content from the application web servers to a CDN improved end-user response times by 20% or more. Switching to a CDN is a relatively easy code change that can dramatically improve your site's speed.

12. Specify an Expires or Cache-Control header

This rule has two aspects:

For static content: set a far-future Expires header ("never expire");

For dynamic content: use an appropriate Cache-Control header to help the browser make conditional requests.

Web page design is becoming richer and richer, which means more scripts, style sheets, images, and Flash in the page. A first-time visitor to your page has to make several HTTP requests, but by using the Expires header you make those components cacheable, avoiding unnecessary HTTP requests on subsequent page views. Expires headers are most often used with image files, but they should be used on all components, including scripts, style sheets, and Flash.

Browsers (and proxies) use a cache to reduce the number and size of HTTP requests, making pages load faster. A web server uses the Expires header in the HTTP response to tell the client how long a component can be cached. The following is a far-future Expires header, telling the browser that this response does not expire until April 15, 2010:

Expires: Thu, 15 Apr 2010 20:00:00 GMT

If your server is Apache, you can use the ExpiresDefault directive to set an expiration date relative to the current date.

The following example of ExpiresDefault sets a header that expires 10 years after the time of the request:

ExpiresDefault "access plus 10 years"

Keep in mind that if you use a far-future Expires header you must change a component's file name whenever its content changes. At Yahoo! this is often done as part of the build process: a version number is embedded in the component's file name, for example yahoo_2.0.6.js.
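A sketch of that versioning step (the helper function is hypothetical, not Yahoo!'s actual build tool):

```javascript
// Embed a version number in a component's file name so a far-future
// Expires header stays safe: new content automatically gets a new URL.
function versionedName(filename, version) {
  const dot = filename.lastIndexOf(".");
  return filename.slice(0, dot) + "_" + version + filename.slice(dot);
}

console.log(versionedName("yahoo.js", "2.0.6")); // yahoo_2.0.6.js
```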

Using a far-future Expires header affects page views only after a user has already visited your site. It has no effect on the number of HTTP requests when a user visits for the first time and the browser's cache is empty. Therefore the performance payoff depends on how often users hit your pages with a "primed cache" (one that already contains all of the page's components). Yahoo! built a way to measure this and found that 75% to 85% of all page views are made with a primed cache. By using a far-future Expires header you increase the number of components cached by the browser and reused on subsequent page views, without a single byte being sent over the wire.

13. Gzip file content

The time it takes to transfer an HTTP request and response across the network can be significantly reduced by decisions made on the front end. True, the end user's bandwidth, their Internet provider, and their proximity to peering exchange points are beyond the development team's control. But other variables affect response times too. Compression reduces the size of the HTTP response and thereby reduces response time.

Starting with HTTP/1.1, web clients indicate support for compression with the Accept-Encoding header in the HTTP request:

Accept-Encoding: gzip, deflate

If the web server sees this header in the request, it may compress the response using one of the methods the client listed. The web server notifies the client of the compression method via the Content-Encoding header in the response:

Content-Encoding: gzip

Gzip is currently the most popular and most effective compression method. It was developed by the GNU project and standardized in RFC 1952. The only other compression format you are likely to see is deflate, but it is less effective and less popular.

Gzipping generally reduces response size by about 70%. Approximately 90% of today's Internet traffic travels through browsers that claim to support gzip. If you use Apache, the module for configuring gzip depends on your version: Apache 1.3 uses mod_gzip while Apache 2.x uses mod_deflate.

There are known issues with browsers and proxies that can cause a mismatch between what the browser expects and what it receives as compressed content. Fortunately, these edge cases are dwindling as the use of older browsers drops off. The Apache modules help by automatically adding appropriate Vary response headers.

Servers choose what to gzip based on file type, but are often too limited in what they compress. Most websites gzip their HTML documents. It is also worthwhile to gzip scripts and style sheets, but many websites miss this opportunity. In fact, it is worthwhile to compress any text response, including XML and JSON. Image and PDF files should not be gzipped because they are already compressed; trying to gzip them not only wastes CPU but can even increase file sizes.
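That decision can be sketched as a content-type check (the exact list of types below is an assumption; tune it for your server):

```javascript
// Text-like responses compress well; images and PDFs are already compressed.
const COMPRESSIBLE = /^(text\/|application\/(json|xml|javascript|x-javascript))/;

function shouldGzip(contentType) {
  return COMPRESSIBLE.test(contentType);
}

console.log(shouldGzip("text/html"));        // true
console.log(shouldGzip("application/json")); // true
console.log(shouldGzip("image/png"));        // false: already compressed
console.log(shouldGzip("application/pdf"));  // false: already compressed
```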

Gzipping as many appropriate file types as possible is an easy way to reduce page weight and improve the user experience.

For more details on gzip compression, see my other two articles: "gzip page compression principles" and "gzip compression for web performance optimization".

14. Configuring the ETag

Entity tags (ETags) are a mechanism that web servers and browsers use to determine whether the component in the browser's cache matches the one on the origin server (an "entity" is a "component": an image, script, style sheet, etc.). ETags provide a more flexible validation mechanism than the Last-Modified date. An ETag is a string that uniquely identifies a specific version of a component. The only format constraint is that it must be enclosed in double quotes. The origin server specifies a component's ETag using the ETag response header:

HTTP/1.1 200 OK
Last-Modified: Tue, 12 Dec 2006 03:03:59 GMT
ETag: "10c24bc-4ab-457e1c1f"
Content-Length: 12195

Later, if the browser has to validate a component, it uses the If-None-Match header to pass the ETag back to the origin server. If the ETags match, the server returns a 304 status code, saving 12195 bytes of response in this example:

GET /i/yahoo.gif HTTP/1.1
Host: love.52maomao.info
If-Modified-Since: Tue, 12 Dec 2006 03:03:59 GMT
If-None-Match: "10c24bc-4ab-457e1c1f"

HTTP/1.1 304 Not Modified
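The server side of this exchange can be sketched as follows (a simplification; real servers also consult If-Modified-Since):

```javascript
// If the ETag the client sends back matches the current one,
// reply 304 with no body; otherwise send the full 200 response.
function validate(currentEtag, ifNoneMatch, fullBody) {
  if (ifNoneMatch === currentEtag) {
    return { status: 304, body: "" };
  }
  return { status: 200, body: fullBody };
}

const etag = '"10c24bc-4ab-457e1c1f"';
console.log(validate(etag, etag, "...image bytes...").status);          // 304
console.log(validate(etag, '"stale-etag"', "...image bytes...").status); // 200
```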

The problem with ETags is that they are typically constructed using attributes that are unique to the specific server hosting the site. ETags will not match when a browser gets a component from one server and later validates it against a different server, which is all too common on websites that use a cluster of servers to handle requests. By default, both Apache and IIS embed server-specific data in the ETag, which dramatically reduces the odds of the validity check succeeding across multiple servers.

The ETag format in Apache 1.3 and 2.x is inode-size-timestamp. Even if a file lives in the same directory on different servers, with identical size, permissions, and timestamp, its inode differs from one server to the next.

IIS 5.0 and 6.0 have a similar issue with ETags. The ETag format on IIS is Filetimestamp:ChangeNumber, where ChangeNumber is a counter that tracks IIS configuration changes, and it is unlikely to be the same across all the IIS servers behind a website. The result is that for identical components, the ETags Apache or IIS generate differ from one server to another. The browsers then do not receive the small, fast 304 response; instead they get a normal 200 response along with the component's full content.

This problem does not exist if your site is hosted on a single server. But if it runs on multiple servers with Apache or IIS generating the default ETag configuration, your users get slower pages, your servers transfer more content and use more bandwidth, and proxies cannot cache your content efficiently. Even if your components have a far-future Expires header, a conditional GET request is still made whenever the user hits Reload or Refresh.

If you are not taking advantage of the flexible validation that ETags provide, it is better to remove them altogether. The Last-Modified header validates based on the component's timestamp, and removing the ETag header reduces the size of both responses and subsequent requests. A Microsoft support article describes how to remove ETags in IIS. In Apache, simply add this line to the configuration file: FileETag None.

15. Refresh the output buffer as soon as possible

When a user requests a page, the server can take anywhere from 200 to 500 milliseconds to stitch together the HTML in the background. During that time the browser sits idle, waiting for data to arrive. In PHP you can use the flush() function to send the partially ready HTML response to the browser early, so the browser can start fetching components (scripts, etc.) while the backend works on the rest of the HTML page. The benefit is seen mainly when the backend is busy or the front end is otherwise idle.

One of the best places to consider flushing the output buffer is right after the closing </head> tag:

... <!-- css, js -->
</head>
<?php flush(); ?>
<body>
... <!-- content -->

16. Use GET for Ajax requests

The Yahoo! Mail team found that when using XMLHttpRequest, POST is implemented in browsers as a two-step process: the headers are sent first, then the data. So it is best to use GET, which sends just one TCP packet (unless you have a lot of cookies). The maximum URL length in IE is 2K, so if you are sending more than 2K of data you cannot use GET.

An interesting side effect is that POST without actually posting any data behaves like GET. Based on the HTTP specification, GET is meant for retrieving data, so it makes more sense (semantically, too) to use GET when you are only requesting data, and to use POST when sending data to be stored on the server.
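The 2K limit check can be sketched like this (treating 2048 characters as the IE limit is an assumption; the exact figure varies):

```javascript
const MAX_IE_URL_LENGTH = 2048; // assumed approximation of IE's 2K limit

// Prefer GET for pure data retrieval; fall back to POST when the
// resulting URL would exceed what IE can handle.
function chooseMethod(baseUrl, queryString) {
  const fullUrl = baseUrl + "?" + queryString;
  return fullUrl.length <= MAX_IE_URL_LENGTH ? "GET" : "POST";
}

console.log(chooseMethod("/ajax/mail", "folder=inbox"));             // GET
console.log(chooseMethod("/ajax/mail", "ids=" + "7,".repeat(2000))); // POST
```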

17. Place the style sheet on top

While researching performance at Yahoo!, we discovered that moving style sheets to the document HEAD makes pages appear to load faster, because it allows the page to render progressively.

The problem with putting style sheets near the bottom of the document is that it prohibits progressive rendering in many browsers, including Internet Explorer. These browsers block rendering to avoid having to redraw page elements if their styles change, so the user is stuck viewing a blank page.

The HTML specification clearly states that style sheets are to be included in the HEAD of the page.

18. Avoid using CSS expressions (expression)

CSS expressions are a powerful (and dangerous) way to set CSS properties dynamically. Internet Explorer has supported them since version 5. In the following example, a CSS expression sets the background color to alternate every hour:

background-color: expression( (new Date()).getHours() % 2 ? "#B8D4FF" : "#F08A00" );

As shown above, expression() accepts a JavaScript expression, and the CSS property is set to the result of evaluating it. The expression() method is ignored by other browsers, so it is occasionally useful for setting properties in Internet Explorer alone as part of a consistent cross-browser design.

The problem with expressions is that they are evaluated more frequently than most people expect: not only when the page is rendered and resized, but also when the page is scrolled and even when the user moves the mouse over the page. Adding a counter to a CSS expression lets you track how often it is evaluated; moving the mouse around a page can easily trigger more than 10,000 evaluations.

One way to reduce the number of times a CSS expression is evaluated is to use a one-time expression: on its first evaluation it sets the style property to an explicit value, which replaces the CSS expression. If the style property must change dynamically throughout the life of the page, using event handlers instead of CSS expressions is a viable alternative. If you must use CSS expressions, remember that they may be evaluated thousands of times and could affect the performance of your page.

19. Using external JavaScript and CSS

Many of these performance rules deal with how external components are managed. However, before those considerations arise you should ask a more basic question: should JavaScript and CSS be contained in external files, or inlined in the page itself?

Using external files in the real world generally produces faster pages, because the JavaScript and CSS files are cached by the browser. JavaScript and CSS that are inlined in HTML documents get downloaded again with every HTML document request. This reduces the number of HTTP requests but increases the size of the HTML document. On the other hand, if the JavaScript and CSS are in external files cached by the browser, the HTML document size is reduced without increasing the number of HTTP requests.

The key factor, then, is how often external JavaScript and CSS components are served from cache relative to the number of HTML documents requested. This factor, although difficult to quantify, can be gauged with various metrics. If users on your site have multiple page views per session, and many of your pages reuse the same scripts and style sheets, there is a greater benefit from cached external files.

Many websites lack the infrastructure to gather these metrics. For them, the best solution is generally to deploy JavaScript and CSS as external files. The one exception where inlining is preferable is home pages, such as Yahoo!'s front page and My Yahoo!. Home pages that have few (perhaps only one) page views per session may find that inlining JavaScript and CSS results in faster end-user response times.

For front pages that are typically the first of many page views, there are techniques that combine the reduced HTTP requests of inlining with the caching benefits of external files. One such technique is to inline the JavaScript and CSS in the front page, but dynamically download the external files after the page has finished loading; when a subsequent page references them, they are already in the browser's cache.

20. Minify JavaScript and CSS

Minification is the practice of removing unnecessary characters from code to reduce its size, thereby saving download time. When code is minified, all comments and unneeded white-space characters (spaces, newlines, tab indents) are removed. For JavaScript this improves response time because the size of the downloaded file is reduced. The two most widely used tools for minifying JavaScript are JSMin and the YUI Compressor; the YUI Compressor can also minify CSS.

Obfuscation is an alternative form of source-code optimization. It is more complex than minification and more prone to introducing bugs. In a survey of the top ten U.S. websites, minification shrank code by 21%, versus 25% for obfuscation. Although obfuscation compresses slightly better, minifying JavaScript carries far less risk.

In addition to minifying external scripts and stylesheets, inline <script> and <style> blocks can and should be minified too. Even when scripts and stylesheets are gzipped, minifying them still saves 5% or more. As the amount and size of JavaScript and CSS keep growing, minification will only pay off more.
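To illustrate what minification does, here is a deliberately naive sketch that strips comments and collapses whitespace. Real tools such as JSMin and YUI Compressor are far more careful; regexes like these can mangle string and regex literals, so treat this as a demonstration of the idea, not a usable minifier.

```javascript
// Toy minifier: remove comments, collapse whitespace. Demonstration only.
function naiveMinify(code) {
  return code
    .replace(/\/\*[\s\S]*?\*\//g, '')   // strip /* block comments */
    .replace(/\/\/[^\n]*/g, '')         // strip // line comments
    .replace(/\s+/g, ' ')               // collapse runs of whitespace
    .trim();
}

var src = '// add two numbers\nfunction add(a, b) {\n    return a + b;\n}';
naiveMinify(src); // → "function add(a, b) { return a + b; }"
```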

21. Replace @import with <link>

An earlier best practice said CSS should be placed at the top so the page renders progressively. In IE, @import behaves the same as placing the <link> at the bottom of the page, so it is best not to use it.
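The two forms look like this (the stylesheet name is illustrative):

```html
<!-- Preferred: -->
<link rel="stylesheet" type="text/css" href="styles.css">

<!-- Avoid: in IE this behaves as if the stylesheet were linked at the
     bottom of the page, delaying progressive rendering. -->
<style type="text/css">
  @import url("styles.css");
</style>
```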

22. Avoid using filters

The IE-proprietary AlphaImageLoader filter is used to fix the display of translucent PNG images in versions prior to IE7. The problem is that while the image is loading, the filter blocks rendering and freezes the browser. It is also applied once per element (not once per image), which drives up memory consumption, so its problems are manifold.

The best way to avoid AlphaImageLoader entirely is to use the PNG8 format instead, which works fine in IE. If you really must use AlphaImageLoader, apply it with the underscore hack (_filter) so that it is ignored by IE7 and later.
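A sketch of the underscore hack (the class name and image path are made up for illustration); only IE6 and earlier parse the underscore-prefixed property, so IE7+ users skip the filter's cost:

```css
/* Prefer PNG8 so no filter is needed at all. If AlphaImageLoader is
   unavoidable, gate it behind the underscore hack: */
.logo {
  background: none;
  _filter: progid:DXImageTransform.Microsoft.AlphaImageLoader(src='logo.png', sizingMethod='scale');
}
```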

23. Place the script at the bottom of the page

The problem scripts cause is that they block parallel downloads. The HTTP/1.1 specification suggests browsers open no more than two concurrent connections per hostname. If you serve your images from multiple hostnames, you can get more than two downloads running in parallel. While a script is downloading, however, the browser will not start any other downloads, even from different hostnames.

In some cases it is not easy to move a script to the bottom of the page. If the script uses document.write to insert part of the page's content, for example, it cannot be moved down. There can also be scoping issues. In many cases, though, there are ways around these problems.

A frequently suggested alternative is the deferred script. The defer attribute indicates that the script contains no document.write, telling the browser it can continue rendering. Unfortunately, Firefox does not support defer. In Internet Explorer the script may be deferred, but not as much as you would like. If a script can be deferred, though, it can also be moved to the bottom of the page, and that will make your pages load faster.
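For reference, a deferred script looks like this (reusing the menu_1.0.17.js file name that appears later in this article; support caveats are as described above):

```html
<!-- defer signals that the script contains no document.write,
     so the browser may continue rendering while it downloads: -->
<script defer type="text/javascript" src="menu_1.0.17.js"></script>
```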

24. Eliminate duplicate scripts

Including the same JavaScript file twice in one page hurts performance. You might think this is rare. A review of the ten top U.S. websites showed that two of them contained a duplicated script. Two main factors increase the odds of a script being included twice: team size and the number of scripts. When it happens, the duplicate script causes unnecessary HTTP requests and wasted JavaScript execution, which degrades performance.

The unnecessary HTTP requests happen in Internet Explorer but not in Firefox. In Internet Explorer, if an external script is included twice and is not cacheable, two HTTP requests are generated during page load. Even if the script is cacheable, extra HTTP requests occur when the user reloads the page.

Beyond the extra HTTP requests, time is wasted evaluating the same script more than once. In both Internet Explorer and Firefox, this redundant JavaScript execution happens regardless of whether the script is cacheable.

One way to avoid accidentally including the same script twice is to implement a script management module in your templating system. The typical way to include a script is with a <script> tag in the HTML page:

<script type="text/javascript" src="menu_1.0.17.js"></script>

In PHP, an alternative is to create a function called insertScript:

In addition to preventing the same script from being inserted multiple times, this function can handle other script-related concerns, such as dependency checking and adding version numbers to script filenames to support far-future Expires headers.
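The article's insertScript idea, sketched here in JavaScript rather than PHP (the original PHP listing is not included in this copy of the article). A central registry ensures each script file is emitted at most once per page, no matter how many templates ask for it. All names are illustrative.

```javascript
// Registry of script files already emitted on this page.
var insertedScripts = new Set();

// Returns the <script> tag the first time a file is requested,
// and an empty string on any later request for the same file.
function insertScript(file) {
  if (insertedScripts.has(file)) {
    return '';                       // duplicate: emit nothing
  }
  insertedScripts.add(file);
  return '<script src="' + file + '"></script>';
}

insertScript('menu_1.0.17.js'); // → '<script src="menu_1.0.17.js"></script>'
insertScript('menu_1.0.17.js'); // → '' (duplicate suppressed)
```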

25. Reduce DOM Access

Accessing DOM elements with JavaScript is slow, so to make pages more responsive you should:

1. Cache references to elements you have already accessed;

2. Update nodes "offline" and then add them to the document tree;

3. Avoid fixing page layout with JavaScript.
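Points 1 and 2 above can be sketched like this: look the container up once, build the new nodes in a DocumentFragment, and touch the live document tree only once at the end. The list id and item text are made up for illustration.

```javascript
// Append items to a list with one cached lookup and one live-tree update.
function appendItems(listId, items) {
  var list = document.getElementById(listId);   // cache the lookup; don't repeat per item
  var fragment = document.createDocumentFragment();
  items.forEach(function (text) {
    var li = document.createElement('li');
    li.appendChild(document.createTextNode(text));
    fragment.appendChild(li);                   // offline: no reflow yet
  });
  list.appendChild(fragment);                   // single update to the live tree
}

// Usage (hypothetical): appendItems('news', ['headline 1', 'headline 2']);
```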

For more information on this, see Julien Lecomte's YUI Theater talk, "High Performance Ajax Applications".

26. Develop Intelligent event handlers

Sometimes pages feel unresponsive because too many event handlers are attached to DOM elements and some of them fire very frequently. That is why event delegation is a good technique. If you have ten buttons inside a div, attach a single handler to the div instead of one handler per button. Events bubble up, so the one handler can capture the event and determine which button it came from.
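The ten-buttons-one-handler idea described above, as a sketch (element ids are made up; modern addEventListener syntax assumed):

```javascript
// One click handler on the container; the bubbled event's target
// identifies which button actually fired.
function delegateClicks(containerId, onButtonClick) {
  document.getElementById(containerId).addEventListener('click', function (e) {
    var target = e.target;
    if (target && target.tagName === 'BUTTON') {
      onButtonClick(target);      // one handler serves every button
    }
  });
}

// Usage (hypothetical):
// delegateClicks('toolbar', function (btn) { console.log(btn.id); });
```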

Likewise, you do not need to wait for the onload event before working with the DOM tree. All you need is for the elements you want to access to be present in the tree; you do not have to wait for all the images to load.

You might consider using the DOMContentLoaded event instead of onload, or the onAvailable method of the YUI Event utility.

27. Reduce Cookie Size

HTTP cookies are used for many purposes, such as authentication and personalization. Cookie data is exchanged between the web server and the browser in HTTP headers, so keeping cookies as small as possible matters for reducing the user's response time.

For more information, see Tenni Theurer and Patty Chi's article "When the Cookie Crumbles". The main findings of that research include:

1. Eliminate unnecessary cookies;

2. Keep cookie sizes as small as possible to minimize the impact on response time;

3. Be mindful of setting cookies at the appropriate domain level so that subdomains are not affected unnecessarily;

4. Set an Expires date appropriately: an earlier Expires date (or none at all) removes the cookie sooner, improving the user's response time.
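Since the whole Cookie header travels with every request to the page's host, its size is what you pay per request. A small helper to estimate that cost from a cookie string such as document.cookie (the names and values below are illustrative):

```javascript
// Approximate size in characters of the Cookie request header for a
// given cookie string; a fair proxy for bytes with ASCII cookies.
function cookieHeaderSize(cookieString) {
  return 'Cookie: '.length + cookieString.length;
}

cookieHeaderSize('sid=abc123; theme=dark'); // → 30
```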

28. Use Cookie-free Domains for Page Content

When the browser requests a static image, it sends cookies along with the request, but the server has no use for them; they only create network traffic for no benefit. You should make sure requests for static components are cookie-free. Create a subdomain and host all your static content there.

If your domain is www.52maomao.info, you could host static content at static.52maomao.info. However, if you have set cookies on the top-level domain 52maomao.info rather than on www.52maomao.info, then all requests to static.52maomao.info will include those cookies. In that case, you can buy a whole new domain to host your static content and keep it cookie-free. Yahoo! uses yimg.com, YouTube uses ytimg.com, Amazon uses images-amazon.com, and so on.

Another benefit of hosting static content on a cookie-free domain is that some proxies may refuse to cache components that are requested with cookies. On a related note: when deciding whether your home page should be 52maomao.info or www.52maomao.info, consider the cookie impact. Omitting the www leaves you no choice but to set cookies on *.52maomao.info (a wildcard covering all subdomains), so for performance it is best to use the www subdomain and set your cookies there.

29. Optimize Images

After the designers have finished creating the images for your pages, don't rush to upload them to the web server; there are still a few things to do:

You can check whether the number of colors in a GIF matches the size of its palette. Checking is easy with the following ImageMagick command: identify -verbose image.gif

If you find the image uses only 4 colors but the palette shows 256 color slots, there is room for compression.

Try converting GIFs to PNG to see whether that saves space; in most cases it does. Designers were often reluctant to use PNG because of limited browser support, but that is a thing of the past. The only remaining issue is alpha-channel translucency in true-color PNGs; GIF, however, is not a true-color format and does not support variable transparency either. So anything GIF can do, palette PNG (PNG8) can do too (except animation). This simple command safely converts GIF to PNG: convert image.gif image.png

"What we want to say is: give PNG a chance to play!"

Run pngcrush (or another PNG optimization tool) on all your PNG images. For example:

pngcrush image.png -rem alla -reduce -brute result.png

Run jpegtran on all JPEG images. This tool performs lossless operations on JPEGs (such as rotation) and can also optimize them and strip comments and other useless metadata (such as EXIF information): jpegtran -copy none -optimize -perfect src.jpg dest.jpg

30. Optimize CSS Sprites

Arrange the images in the sprite horizontally; arranging them vertically slightly increases the file size.

Combining similar colors in a sprite helps keep the color count low, ideally under 256 colors so the PNG8 format can be used.

Be mobile-friendly and don't leave big gaps between the images in a sprite. This doesn't affect the file size much, but it takes less memory for the user agent to decompress the image into a pixel map. A 100x100 image is 10,000 pixels; a 1000x1000 image is one million.
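The arithmetic behind that last point, assuming the common case of roughly 4 bytes per pixel (RGBA) for a decompressed image; empty gaps in a sprite are paid for in memory even when they cost little in file size:

```javascript
// Rough decoded-size estimate for an image: width * height * 4 bytes (RGBA).
function decodedSpriteBytes(width, height) {
  return width * height * 4;
}

decodedSpriteBytes(100, 100);   // → 40000  (10,000 px, ~39 KB decoded)
decodedSpriteBytes(1000, 1000); // → 4000000 (1,000,000 px, ~3.8 MB decoded)
```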

31. Do not scale the image in HTML

Don't use a bigger image than you need just because you can set its width and height in HTML. If you need <img width="100" height="100" src="mycat.jpg" alt="My Cat" />, then mycat.jpg should be a 100x100 image, not a 500x500 image scaled down.

32. Make favicon.ico Small and Cacheable

favicon.ico is an image file that lives in the server's root directory. It is a necessary evil: even if you don't care about it, the browser will still request it, so it's best not to respond with a 404 Not Found. Because it lives on the same server, cookies are sent with every request for it. The file also interferes with the download sequence; in IE, for example, if you request additional components in onload, the favicon is downloaded before those components.

So, to mitigate the drawbacks of favicon.ico: keep the file small, preferably under 1K.

Set an Expires header for it when you can live with the consequences, that is, when you don't plan to change the favicon, since it cannot be renamed when you replace it. You can safely set the Expires header a few months into the future; checking the last modified date of your current favicon.ico helps you make that call.

ImageMagick can help you create a small favicon.

33. Keep Components Under 25K

This limit exists mainly because the iPhone will not cache components larger than 25K. Note that this is the uncompressed size. Since gzip alone may not get a file under the limit, minification matters here too.

For more information, see Wayne Shea and Tenni Theurer's article "Performance Research, Part 5: iPhone Cacheability - Making it Stick".

34. Pack Components into a Multipart Document

Packing page components into a multipart document is like an email with attachments: it lets you fetch several components with a single HTTP request (remember: HTTP requests are expensive). Before relying on this rule, check whether the user agent supports it (the iPhone does not).
