34 Golden Rules for Yahoo! Website Performance

Source: Internet
Author: User
Tags: response code, website performance, Akamai Technologies


Yahoo!'s Exceptional Performance team has compiled best practices for improving web performance. The team ran a series of experiments, developed tools, wrote many articles and blog posts, and spoke at various conferences. The core of this work is a set of best practices for improving site performance.
The Exceptional Performance team summed up these ways of speeding up websites as 34 rules in 7 categories: content, server, cookies, CSS, JavaScript, images, and mobile.

I. Content

The content category contains 10 rules:

Minimize HTTP Requests

Reduce DNS Lookups

Avoid Redirects

Make Ajax Cacheable

Defer Loading of Content

Preload Content

Reduce the Number of DOM Elements

Split Components Across Domains

Minimize the Number of Iframes

Avoid 404 Errors


1. Minimize the number of HTTP requests
About 80% of end-user response time is spent downloading the components of the page: images, stylesheets, scripts, Flash, and so on. Reducing the number of components in a page reduces the number of HTTP requests required to render it, and this is a key step in speeding up your web pages.
One way to reduce page components is to simplify the page's design. But is there a way to keep pages rich while still improving response time? Here are a few techniques that reduce the number of HTTP requests while preserving rich page content.

Merging files reduces HTTP requests by combining all scripts into a single script file and, similarly, all CSS into a single stylesheet. When scripts and stylesheets vary from page to page, merging them requires extra work and can be tricky, but even so it is an important step toward better page performance.

CSS sprites are an effective way to reduce image requests: combine all background images into a single image file, then use the CSS background-image and background-position properties to display just the part of the image you need.
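A minimal sprite sketch (the file name, class names, and pixel offsets are all hypothetical):

```css
/* One combined image backs every icon; each class only shifts the
   visible window with background-position. */
.icon      { background-image: url('sprite.png'); width: 16px; height: 16px; }
.icon-home { background-position: 0 0; }
.icon-mail { background-position: -16px 0; }
```

The browser downloads sprite.png once, and every icon on the page reuses that single request.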

Image maps combine multiple images into a single image. The overall size of the file is about the same, but the number of HTTP requests drops. Image maps only work when the images are contiguous in the page, as in a navigation bar. Defining the map coordinates by hand is tedious and error-prone, and image-map navigation is not accessible, so this technique is not recommended for navigation.

Inline images embed the image data directly in the page using the data: URL scheme. This can increase the size of the HTML document, but placing inline images in a (cacheable) stylesheet reduces HTTP requests without bloating the pages themselves. Note that data: URLs are not supported by older browsers such as Internet Explorer 7 and earlier.
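A sketch of building such a data: URL (the byte contents and mime type here are only illustrative):

```javascript
// Build a data: URI so a small image can be inlined in a stylesheet
// instead of costing its own HTTP request.
function toDataUri(bytes, mimeType) {
  const base64 = Buffer.from(bytes).toString('base64');
  return `data:${mimeType};base64,${base64}`;
}
```

A stylesheet rule can then use the result in background-image: url(...); because the stylesheet itself is cacheable, the inlined image is cached along with it.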

Reducing the number of HTTP requests in a page is the place to start. It is the most important guideline for improving latency for first-time visitors. As Tenni Theurer describes in her blog post "Browser Cache Usage - Exposed!", 40% to 60% of daily visitors arrive with an empty cache, so making the page fast for these first-time, empty-cache visitors is key to a good user experience.

2. Reduce the number of DNS lookups
The Domain Name System (DNS) maps hostnames to IP addresses, just as a phone book maps names to phone numbers. When you type www.kuqin.com into the browser's address bar, a DNS resolver returns the IP address for that hostname. DNS resolution takes time: typically 20 to 120 milliseconds to return the IP address for a given hostname, and the browser cannot download anything from the host until the lookup completes.

Caching DNS lookups improves performance. DNS information can be cached on special caching servers, typically run by the user's ISP or local network, but caching also happens on the user's own machine: the operating system keeps a DNS cache (the DNS Client service on Microsoft Windows), and most browsers maintain their own caches, separate from the operating system's. As long as the browser keeps a DNS record in its own cache, it does not need to ask the operating system for it.

Internet Explorer caches DNS lookups for 30 minutes by default, controlled by the DnsCacheTimeout registry setting. Firefox caches DNS lookups for 1 minute, controlled by the network.dnsCacheExpiration configuration setting (the Fasterfox extension raises it to 1 hour).

When the client's DNS cache is empty (in both the browser and the operating system), the number of DNS lookups equals the number of unique hostnames in the page, counting the hostnames used in the page's URL, images, script files, stylesheets, Flash objects, and so on. Reducing the number of unique hostnames reduces the number of DNS lookups.

Reducing the number of unique hostnames, however, also reduces the amount of parallel downloading in the page. Avoiding DNS lookups cuts response times, but reducing parallel downloads may increase them. My guideline is to split the page's components across at least two but no more than four hostnames. This is a good compromise between reducing DNS lookups and allowing a high degree of parallel downloads.

3. Avoid redirects
Redirects are implemented with the 301 and 302 status codes. Here are the HTTP headers of an example 301 response:
HTTP/1.1 301 Moved Permanently
Location: http://example.com/newuri
Content-Type: text/html
The browser takes the user to the URL given in the Location header. All the information needed for a redirect fits in the headers; the body can be empty. Despite their names, neither a 301 nor a 302 response is cached in practice unless additional headers, such as Expires or Cache-Control, indicate that it should be. The <meta> refresh tag and JavaScript can also send users to a different URL, but if you must redirect, the preferred technique is a standard 3xx HTTP status code, mainly to make sure the back button works correctly.

But keep in mind that redirects degrade the user experience. Inserting a redirect between the user and the HTML document delays everything in the page, because nothing in the page (images, Flash, etc.) can be downloaded until the HTML document itself has arrived.

One redirect that web developers often overlook, and that wastes response time, occurs when a URL that should have a trailing slash (/) is requested without one. For example, requesting http://astrology.yahoo.com/astrology actually returns a 301 response pointing to http://astrology.yahoo.com/astrology/ (note the trailing slash). On Apache, this can be avoided using Alias, mod_rewrite, or the DirectorySlash directive.
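One hedged sketch of how this might look with Apache's mod_rewrite (the path is hypothetical, and the exact directives depend on your server layout):

```apache
RewriteEngine On
# Internally rewrite the slashless URL to the slashed directory form,
# so the client is served the content directly instead of receiving
# a 301 round trip.
RewriteRule ^/astrology$ /astrology/ [PT]
```

Alias and DirectorySlash are alternative ways to achieve the same goal; consult the Apache documentation for which fits your configuration.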

Linking an old website to a new one is another common use of redirects. Others include connecting different parts of a site and sending users to different destinations based on criteria such as browser type or account type. Using a redirect to tie two websites together is simple and requires little code. Although this reduces complexity for developers, it degrades the user experience. Alternatives include Alias and mod_rewrite when both code paths are hosted on the same server. If the redirect exists because the domain names differ, an alternative is to create a CNAME (a DNS record that maps one domain name to another) combined with Alias or mod_rewrite.

4. Make Ajax cacheable
One of the often-cited benefits of Ajax is the immediacy of its feedback to the user, because it requests information from the backend server asynchronously. However, using Ajax is no guarantee that the user won't be left waiting for those asynchronous JavaScript and XML responses. In many applications, whether the user has to wait depends on how Ajax is used. In a web-based email client, for example, the user is kept waiting while Ajax fetches the mail messages that match their search criteria. It is important to remember that "asynchronous" does not mean "instantaneous".

To improve performance, it is important to optimize Ajax responses. The most important way to make Ajax faster is to make its responses cacheable, as discussed in the rule "Add an Expires or Cache-Control header". Several of the other rules apply to Ajax as well:
Gzip components
Reduce DNS lookups
Minify JavaScript
Avoid redirects
Configure ETags

Let's look at an example. A Web 2.0 email client uses Ajax to download the user's address book on demand. If the user has not modified the address book since the last time they used the email application, and the Ajax response was made cacheable with Expires or Cache-Control headers, the previous address book can be read straight from the cache. The browser must be told when to use the cached address book and when to send a new request. This is done by adding a timestamp of the last edit to the address book's Ajax URL, for example &t=11900241612. If the address book has not been edited since the last download, the timestamp is unchanged, the URL matches, and the address book is loaded from the browser's cache, eliminating an HTTP round trip. If the user has modified the address book, the timestamp changes, the new URL no longer matches the cached response, and the browser requests the updated data.
Even though your Ajax responses are created dynamically, and even if they are relevant to only a single user, they can still be cached. Doing so makes your Web 2.0 applications faster.
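The timestamped URL idea above can be sketched like this (the endpoint name is hypothetical):

```javascript
// Derive the address-book Ajax URL from its last-edit timestamp.
// Same timestamp -> same URL -> the cached response (served with a
// far-future Expires header) is reused; a new edit changes the URL
// and forces a fresh request.
function addressBookUrl(lastEditMs) {
  return '/ws/addressbook?t=' + lastEditMs;
}
```

The server only needs to emit the user's last-edit time into the page; the browser's normal cache rules do the rest.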

5. Defer loading of content
Take a close look at your web pages and ask yourself: what is absolutely required to render the page initially, and what content and components can wait?
JavaScript is an ideal candidate for splitting into two parts around the onload event. For example, if you have JavaScript that implements drag-and-drop and animations, it can wait to be loaded later, because dragging elements on the page only happens after the initial render. Other candidates for deferred loading include hidden content (content that appears only after a user action) and images below the fold.
Tools can save you the work: the YUI Image Loader lets you defer images below the fold, and the YUI Get utility is an easy way to fetch JS and CSS on the fly. As an example, open the Firebug Net panel while looking at Yahoo!'s home page.
It's good when performance goals align with other web development best practices. In this case, progressive enhancement tells us that JavaScript, when supported, can improve the user experience, but the site must still work correctly without it. After you have made sure the page works without JavaScript, you can load the scripts later to add fancier effects such as drag-and-drop and animation.
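The onload split described above can be sketched as follows; the document object is passed in so the idea can be exercised outside a browser, and the script URL is hypothetical.

```javascript
// Inject a non-critical script only after the page's own onload has
// fired, so it never competes with the initial render.
function loadDeferred(doc, src) {
  const script = doc.createElement('script');
  script.src = src;
  doc.body.appendChild(script);
  return script;
}

// In a real page you would wire it up like:
//   window.onload = () => loadDeferred(document, '/js/dragdrop.js');
```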

6. Preload content
Preloading may look like the opposite of deferred loading, but it actually has a different goal. Preloading uses the time the browser is idle to fetch components (such as images, stylesheets, and scripts) that may be needed in the future. With this approach, when the user visits the next page, most of its content is already in the cache, and the page displays much faster.

Several kinds of preloading are possible:
Unconditional preload: as soon as the onload event fires, fetch additional components. Take google.com as an example: look at how its sprite image is requested in onload. That sprite image is not needed on the google.com home page, but it is needed on the search results page.
Conditional preload: based on a user action, make an educated guess about where the user is headed next and preload accordingly. On search.yahoo.com you can see extra components being requested as you start typing in the search box.
Anticipated preload: preload before launching a redesign. It often happens that after a redesign, users complain that "the new site looks cool, but it's slower than before." The problem may be that users had your old site fully cached and nothing cached for the new one. You can mitigate this by preloading components of the new site before it launches: while users are on the old site, use the browser's idle time to fetch the images and scripts the new site will use, making the new site faster to visit.
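Unconditional preloading can be sketched like this; the Image constructor is injected so the logic can run outside a browser (in a page you would pass window.Image), and the URLs are hypothetical.

```javascript
// Once the current page is done loading, warm the browser cache with
// assets the next page will likely need.
function preload(urls, ImageCtor) {
  return urls.map((url) => {
    const img = new ImageCtor();
    img.src = url; // assigning src makes the browser fetch and cache it
    return img;
  });
}

// In a page: window.onload = () => preload(['/results_sprite.png'], Image);
```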

7. Reduce the number of DOM elements
A complex page means more bytes to download, and it also means slower DOM access in JavaScript. It makes a difference, for example, whether you loop over 500 or 5,000 DOM elements when attaching an event handler.
A high number of DOM elements can be a sign that there is markup to clean up without removing any content, simply by using more appropriate tags. Are you using tables for layout? Are you adding extra <div> elements only to achieve a layout? There may be a better, more semantic tag for the job.
YUI CSS Utilities are a great help with layout: grids.css handles the overall layout, while fonts.css and reset.css strip away the browser's default formatting. They offer a chance to take a fresh look at your markup, for example using a <div> only when it is semantically meaningful, not just because it produces a new line.
The number of DOM elements is easy to check: just type this into the Firebug console:
document.getElementsByTagName('*').length
And how many DOM elements is too many? Compare against similar pages that have good markup. The Yahoo! home page, for example, is a fairly busy page and yet uses only about 700 elements (HTML tags).

8. Split components across domains
Splitting components across domains lets you maximize parallel downloads. Because of the DNS-lookup penalty, make sure you use no more than 2 to 4 domains. For example, you could host the HTML and dynamic content on www.example.org and split the static components (images, scripts, CSS) between statics1.example.org and statics2.example.org.
You can find more information in the article "Maximizing Parallel Downloads in the Carpool Lane" by Tenni Theurer and Patty Chi.
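A sketch of splitting static assets across a fixed set of hostnames (the hostnames are hypothetical); hashing the path keeps each asset's hostname stable from page to page, so a component cached on one page is reused on the next:

```javascript
// Deterministically map each static asset to one of a small, fixed
// set of hostnames (2-4, per the DNS trade-off above).
const SHARDS = ['statics1.example.org', 'statics2.example.org'];

function shardUrl(path) {
  let hash = 0;
  for (const ch of path) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple string hash
  }
  return 'http://' + SHARDS[hash % SHARDS.length] + path;
}
```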

9. Minimize the number of iframes
The iframe element inserts a new HTML document into the parent document. It is important to understand how iframes work before using them effectively.
<iframe> Advantages:

Helps with slow-loading third-party content such as badges and ads

Security sandbox

Downloads scripts in parallel

<iframe> disadvantages:

Costly even if blank; an empty iframe still takes time to load

Blocks the page's onload event

No semantics.


10. Avoid 404 errors
HTTP requests are expensive, so spending one only to receive a useless response (such as 404 Not Found) is completely unnecessary; it degrades the user experience with no benefit whatsoever.
Some sites have helpful custom 404 pages ("Did you mean ...?"), which improves the user experience but also wastes server resources (database queries, etc.). Particularly bad is when the link to an external JavaScript file is wrong and the result is a 404. First, this download blocks parallel downloads; second, the browser may try to parse the body of the 404 response as JavaScript, attempting to find something usable in it.

The first part of this series covered the 10 rules that relate to the content of the website. Beyond content improvements, there are also things to watch and improve on the web server side, including:

Use a content delivery network

Add Expires or Cache-Control headers

Gzip components

Configure ETags

Flush the output buffer early

Use GET for Ajax requests


11. Use a content delivery network
The user's proximity to your web server has an impact on response times. Deploying your content across multiple, geographically dispersed servers makes your pages load faster from the user's perspective. But where should you start?
As a first step, do not attempt to redesign your web application to work in a distributed architecture. Depending on the application, changing the architecture can involve daunting tasks such as synchronizing session state and replicating database transactions across server locations. Attempts to shorten the distance between users and your content can be delayed by, or never make it past, this application-architecture step.
Remember that 80% to 90% of end-user response time is spent downloading page components: images, stylesheets, scripts, Flash, and so on. This is the Performance Golden Rule. Rather than starting with the difficult task of redesigning the application architecture, it is better to disperse the static content first. This not only achieves a bigger reduction in response times, it is also easier, thanks to content delivery networks.
A content delivery network (CDN) is a collection of web servers distributed across multiple geographic locations to deliver content to users more efficiently. The server selected to deliver content to a particular user is typically chosen by network proximity: for example, the server with the fewest network hops or the fastest response time.
Some large internet companies own their own CDN, but it is cost-effective to use a CDN service provider such as Akamai Technologies, Mirror Image Internet, or Limelight Networks. For startups and personal websites, the cost of a CDN service may be prohibitive, but as your target audience grows and becomes more global, a CDN is necessary for fast response times. At Yahoo!, moving static content from the application web servers to a CDN improved end-user response times by 20% or more. Switching to a CDN is a relatively easy code change that dramatically improves the speed of your website.

12. Add Expires or Cache-Control headers
This rule has two aspects:
For static content: set a far-future Expires header ("Never expire").
For dynamic content: use an appropriate Cache-Control header to help the browser make conditional requests.
Web page designs are getting richer and richer, which means more scripts, stylesheets, images, and Flash in the page. A first-time visitor to your page has to make several HTTP requests, but by using the Expires header you make those components cacheable, avoiding unnecessary HTTP requests on subsequent page views. Expires headers are most often applied to images, but they should be used on all components, including scripts, stylesheets, and Flash.
Browsers (and proxies) use a cache to reduce the number and size of HTTP requests, making pages load faster. A web server uses the Expires header in its HTTP response to tell the client how long a component may be cached. The example below is a far-future Expires header, telling the browser that this response does not expire until April 15, 2010:
Expires: Thu, 15 Apr 2010 20:00:00 GMT
If your server is Apache, you can use the ExpiresDefault directive to set an expiration date relative to the current date. This example sets the expiration to 10 years after the time of the request:
ExpiresDefault "access plus 10 years"
Keep in mind that if you use a far-future Expires header, you must change a component's file name whenever its content changes. At Yahoo!, we often make this part of the build process: a version number is embedded in the component's file name, for example yahoo_2.0.6.js.
Using a far-future Expires header affects page views only after a user has already visited your site. It has no effect on the number of HTTP requests when a user visits your site for the first time and the browser's cache is empty. So the performance benefit of this technique depends on how often users hit your pages with a primed cache (a "primed cache" already contains all of the page's components). Yahoo! built a way to measure this, and we found that 75-85% of page views are made with a primed cache. By using a far-future Expires header, you increase the number of components that are cached by the browser and reused on subsequent page views without a single byte being sent over the wire.
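The file-name versioning mentioned above can be sketched as a tiny build-step helper (the names are hypothetical):

```javascript
// When a component carries a far-future Expires header, a content
// change must also change its URL; embedding a version in the file
// name (as in yahoo_2.0.6.js) is a simple way to do that.
function versionedName(base, version, ext) {
  return `${base}_${version}.${ext}`;
}
```

Bumping the version on each release guarantees that browsers holding the old, "never expiring" file fetch the new one.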
