PHP Tutorial: Mining the Details to Enhance Site Performance

Source: Internet
Author: User
Tags: header, HTTP request, PHP tutorial, query, domain, domain name, client

It's fair to say the Internet has become an indispensable part of people's lives. Rich-client technologies such as Ajax and Flex let people enjoy a more "pleasant" experience of many functions that could previously only be implemented in C/S applications; Google, for example, has moved the most basic office apps onto the web. At the same time, this convenience makes pages ever heavier and slower. I work in front-end development, and on the performance side, Yahoo's research found that the back end accounts for only 5% of response time while the front end accounts for as much as 95%, of which about 88% can be optimized.

There is a well-known life-cycle diagram of a Web 2.0 page (not reproduced here), which the engineer figuratively divided into four stages: "pregnancy, birth, graduation, and marriage." If we are aware of this whole process, rather than seeing a click on a link as a simple request-response, we can dig a lot of performance improvements out of the details. Today I heard "Pony" at Taobao give a lecture on the Yahoo development team's web performance research; I felt I gained a lot and want to share it on the blog.

I believe many people have heard of the 14 rules for optimizing website performance. More information is available at developer.yahoo.com

1. Reduce the number of HTTP requests as much as possible [Content]
2. Use a CDN (Content Delivery Network) [Server]
3. Add an Expires header (or Cache-Control) [Server]
4. Gzip components [Server]
5. Put stylesheets at the top of the page [CSS]
6. Move scripts to the bottom (including inline ones) [JavaScript]
7. Avoid using expressions in CSS [CSS]
8. Separate JavaScript and CSS into external files [JavaScript] [CSS]
9. Reduce DNS lookups [Content]
10. Compress JavaScript and CSS (including inline) [JavaScript] [CSS]
11. Avoid redirects [Server]
12. Remove duplicate scripts [JavaScript]
13. Configure entity tags (ETags) [Server]
14. Make Ajax cacheable [Content]

Under Firefox there is a plugin called YSlow, integrated into Firebug, which makes it very easy to see how your own site performs on each of these points.

This is YSlow's test result for my site, Westerly Square: very regrettably, only 51 points. Oh well. China's major sites don't score high either; I just measured Sina and NetEase, and both got 31 points. Yahoo (US), by contrast, really does score 97 points! You can see the effort Yahoo has put into this. Beyond the 14 rules they originally summarized, there are now 20 newer points to look at as well; there are many details we genuinely never think about, and some of the practices even seem a little "extreme."

First, reduce the number of HTTP requests as much as possible (Make Fewer HTTP Requests)

HTTP requests are expensive, so finding ways to reduce their number naturally speeds up a page. The common methods are merging CSS and JS (combining a page's CSS files into one, and likewise its JS files), image maps, and CSS sprites. Of course, CSS and JS often end up split into multiple files because of CSS structure, sharing between pages, and so on. Alibaba's Chinese site at the time still developed them separately and then merged the JS and CSS in the background at release: the browser sees a single request, but development keeps multiple files, which are easier to manage and to reference repeatedly. Yahoo even recommends writing the home page's CSS and JS directly into the page file rather than referencing them externally, because the home page gets so many visits that this saves two more requests. In fact, many of the domestic portals do exactly that.
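
As a rough illustration of that background merge, the release step can be as simple as concatenating the source files into one. This is only a minimal sketch under my own assumptions (the file names are invented, and I don't know how Alibaba's actual build works):

<?php
// Minimal sketch of a release-time CSS combiner. The file names are
// hypothetical; the point is that N stylesheets become one request.
$files = array('reset.css', 'layout.css', 'theme.css');

$combined = '';
foreach ($files as $file) {
    // Concatenate each stylesheet, separated by a marker comment.
    $combined .= "/* --- " . $file . " --- */\n" . file_get_contents($file) . "\n";
}

// Write the merged file once at release time; pages then reference only it.
file_put_contents('all.css', $combined);
?>

Pages then link only the merged all.css, so the browser makes one request where it previously made three.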

CSS sprites means merging the background images used on a page into a single image, and then using values of the CSS background-position property to pick out the piece each element needs as its background. Taobao and Alibaba's Chinese site both do this now; if you're interested, take a look at Taobao's and Alibaba's background images.

http://www.csssprites.com/ is a tool site that automatically merges the images you upload and gives you the corresponding background-position coordinates. The results can be exported in PNG or GIF format.

Second, use a CDN (Use a Content Delivery Network)

To tell the truth, I don't understand CDNs deeply myself. Simply put, a CDN adds a new layer of network architecture to the existing Internet and publishes the site's content to the cache servers closest to users. DNS load balancing determines where a user is coming from and sends them to a nearby cache server for the content they need: a user in Hangzhou gets content from a server near Hangzhou, a user in Beijing from a server in Beijing. This effectively cuts the time data spends traveling across the network and improves speed. For more detail, see the Baidu Encyclopedia entry on CDN. Yahoo found that distributing static content to a CDN cut user response time by 20% or more.

CDN technical sketch and CDN network diagram (figures not reproduced here).

Third, add an Expires/Cache-Control header (Add an Expires Header)

More and more images, scripts, CSS, and Flash are embedded in our pages now, and visiting them inevitably means making many HTTP requests. We can cache these files by setting an Expires header. Expires is simply a header field that tells the browser how long to cache a particular kind of file. Most images and Flash never need to change after release, so once they are cached the browser reads them straight from its cache instead of downloading them from the server again, which greatly speeds up another visit to the page. A typical HTTP/1.1 response header looks like this:
HTTP/1.1 200 OK
Date: Fri, 30 Oct 1998 13:19:41 GMT
Server: Apache/1.3.3 (Unix)
Cache-Control: max-age=3600, must-revalidate
Expires: Fri, 30 Oct 1998 14:19:41 GMT
Last-Modified: Mon, 29 Jun 1998 02:28:12 GMT
ETag: "3e86-410-3596fbbc"
Content-Length: 1040
Content-Type: text/html

Both Cache-Control and Expires can be set from a server-side script.

For example, to set Expires 30 days in the future in PHP:

<?php
// Tell caches they must revalidate once the content has expired.
header("Cache-Control: must-revalidate");

// 30 days from now, in seconds.
$offset = 60 * 60 * 24 * 30;

// Expires takes an HTTP-date in GMT.
$ExpStr = "Expires: " . gmdate("D, d M Y H:i:s", time() + $offset) . " GMT";
header($ExpStr);
?>

It can also be done by configuring the server itself; I'm not very clear on that part, hehe. Friends who want to know more can refer to http://www.web-caching.com/

As far as I know, Alibaba's Chinese site currently uses an Expires lifetime of 30 days. There have been problems, though: expiration settings for scripts in particular need careful thought, otherwise clients may take a very long time to "perceive" a change after a script's functionality is updated. I ran into this before with the [suggest project]. So which things should be cached, and which should not, deserves careful consideration.
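
A common workaround for that stale-script problem (a sketch of a general technique, not necessarily what Alibaba does; the helper name and path here are invented) is to version the URL, for example with the file's modification time, so an updated file gets a fresh URL and skips the old cache entry:

<?php
// Hypothetical helper: append the file's modification time as a version
// parameter, so a far-future Expires never serves a stale script.
function versioned_url($path) {
    return $path . '?v=' . filemtime($_SERVER['DOCUMENT_ROOT'] . $path);
}

echo '<script src="' . versioned_url('/js/app.js') . '"></script>';
?>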

Fourth, enable Gzip compression (Gzip Components)

The idea of gzip is to compress files on the server first and then transfer them, which significantly reduces the transfer size; when the transfer finishes, the browser decompresses the content and processes it. Today's browsers support gzip well, and not only browsers: the big "spiders" recognize it too, so you SEOers can rest easy. Gzip's compression ratio is large, typically around 85%, meaning a 100 KB page on the server can be compressed to about 25 KB before being sent to the client. For the details of how gzip works, see the CSDN article "gzip compression algorithm." Yahoo especially emphasizes that all text content should be gzip-compressed: HTML (PHP), JS, CSS, XML, TXT... Our site does well here; it gets an A. Our home page used to miss an A because it carries a lot of advertising JS, and the ad owners' sites served their JS without gzip, which dragged our site down too.
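
In PHP, one simple way to try this out (a minimal sketch; in practice this is more often configured in the server itself, e.g. Apache's mod_deflate) is the ob_gzhandler output-buffer callback, which only compresses when the client says it can handle it:

<?php
// ob_gzhandler inspects the request's Accept-Encoding header and gzips
// the buffered output only for clients that can decompress it.
ob_start('ob_gzhandler');

echo str_repeat('Highly compressible sample text. ', 1000);
?>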

The three points above mostly concern the server side, and my understanding of them is only superficial. Please correct me where I'm wrong.

Fifth, put CSS at the top of the page (Put Stylesheets at the Top)

Why put CSS at the top of the page? Because browsers such as IE and Firefox will not render anything until all the CSS has arrived. As Mago said, the reason is simple. CSS stands for Cascading Style Sheets: "cascading" implies that later CSS can override earlier CSS, and higher-priority CSS can override lower-priority CSS. I briefly touched on this priority relationship at the bottom of the [CSS !important] article; here we only need to know that CSS can be overridden. Since earlier rules can be overridden, it is entirely reasonable for a browser to wait until the CSS is fully loaded before rendering. In many browsers, such as IE, the problem with putting stylesheets near the bottom of the page is that it prohibits the progressive display of page content: the browser blocks display to avoid having to redraw page elements, and the user sees only a blank page. Firefox does not block display, but that means some page elements may need to be repainted when the stylesheet arrives, which causes a flicker problem. So we should get the CSS loaded as early as possible.

Following this line of thought further, there are still places to optimize. For example, this site includes two CSS files: <link rel="stylesheet" href="http://www.space007.com/themes/google/style/google.css" type="text/css" media="screen" /> and <link rel="stylesheet" href="http://www.space007.com/css/print.css" type="text/css" media="print" />. From the media attribute you can see the first CSS is for the screen and the second is a print stylesheet. Judging by users' habits, printing a page necessarily happens after the page has been displayed, so a better approach would be to attach the print CSS dynamically once the page has finished loading, which improves speed a little more. Haha.

Sixth, put scripts at the bottom of the page (Put Scripts at the Bottom)

Putting scripts at the bottom of the page has two goals:

1. To prevent scripts from blocking the download of the rest of the page. While a page loads, whenever the browser reaches a JS statement it interprets and executes it completely before reading anything further. If you don't believe it, write an infinite JS loop and watch: nothing below it will ever appear. (setTimeout and setInterval behave somewhat like multithreading, letting the browser keep rendering the following content until the timer fires.) The browser's logic is that JS may call location.href or any other function that completely changes the page at any moment, so naturally it has to wait for the script to finish before continuing. Putting scripts at the end of the page therefore effectively reduces the load time of the page's visual elements.

2. Scripts also block concurrent downloads. The HTTP/1.1 specification recommends that browsers make no more than two concurrent downloads per hostname (IE allows only 2, and other browsers such as Firefox also default to 2, though the new IE8 can reach 6). So if you distribute image files across several hostnames, you can get more than two concurrent downloads. While a script file is downloading, however, the browser starts no other concurrent downloads at all.

Of course, whether putting scripts at the bottom is feasible varies by site. On Alibaba's Chinese site, for example, many places use inline JS that the page's display depends on heavily. I admit this is far from the ideal of unobtrusive scripting, but a lot of "legacy problems" are not so easy to solve.

Seventh, avoid expressions in CSS (Avoid CSS Expressions)

CSS expressions are an IE-only way to set style properties dynamically, and they are re-evaluated far more often than you would expect (on every mouse move, scroll, or resize), which makes them expensive. The usual workaround adds more than two layers of meaningless nesting, which is certainly not good either; a better approach is needed.

Eighth, make JavaScript and CSS external (Make JavaScript and CSS External)

I think this point is very easy to understand. It's done not only for performance but also because external code is easier to maintain. Writing the CSS and JS into the page itself does save a couple of requests, but it also enlarges the page; and once the external CSS and JS are cached, they cost no extra HTTP requests at all. Of course, as I said earlier, the developers of some special pages still choose to inline their CSS and JS.

Ninth, reduce DNS lookups (Reduce DNS Lookups)

On the Internet, domain names and IP addresses correspond one to one. A domain name (kuqin.com) is easy to remember, but the computer doesn't understand it; between machines it has to be turned into an IP address, and every computer on the network has its own IP address. The conversion between domain name and IP address is called domain name resolution, also known as a DNS lookup. One DNS resolution takes 20-120 milliseconds, and until the lookup completes the browser downloads nothing under that domain. Reducing DNS lookup time therefore speeds up page loading. Yahoo recommends keeping the number of hostnames a page uses to 2-4, which requires planning the page as a whole. At present we don't do this well; a lot of advertising systems drag us down.
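
To get a feel for the cost, here is a toy measurement (the hostnames are arbitrary, and OS-level DNS caching will skew repeated runs) that times gethostbyname() in PHP:

<?php
// Rough illustration of per-hostname DNS cost: time a blocking lookup
// for a couple of (arbitrary) hostnames.
foreach (array('www.yahoo.com', 'www.taobao.com') as $host) {
    $start = microtime(true);
    $ip = gethostbyname($host); // blocking DNS lookup
    printf("%s -> %s (%.1f ms)\n", $host, $ip, (microtime(true) - $start) * 1000);
}
?>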

Tenth, compress JavaScript and CSS (Minify JavaScript and CSS)

The point of minifying JS and CSS is obvious: fewer bytes in the page. A smaller page naturally loads faster, and beyond reducing size, minification also offers a degree of protection for the code. We do well on this point. Commonly used tools include JSMin and the YUI Compressor, and http://dean.edwards.name/packer/ gives us a very convenient online packer. On the jQuery download page you can see the size difference between the minified and unminified JS files:

Of course, one drawback of minification is that the code becomes unreadable. I believe many front-end friends have hit this problem: Google's effects look cool, but its source code is a mass of characters squeezed together, with even the function names replaced. Sweat! Your own code would not be easy to maintain that way. Alibaba's Chinese site currently minifies JS and CSS on the server side at release time, which keeps the code we maintain readable.
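
As a toy illustration of what such a release-time step does (a deliberately naive sketch; real tools like JSMin or the YUI Compressor are far more careful about strings and other edge cases):

<?php
// Naive CSS minifier: strip /* ... */ comments and collapse whitespace.
// Fine as an illustration; real minifiers handle many more edge cases.
function minify_css($css) {
    $css = preg_replace('!/\*.*?\*/!s', '', $css); // remove comments
    $css = preg_replace('/\s+/', ' ', $css);       // collapse whitespace
    return trim(str_replace(array('; ', ' {', '{ ', ': '),
                            array(';', '{', '{', ':'), $css));
}

echo minify_css("body {\n  color: #333; /* text */\n  margin: 0;\n}");
// prints: body{color:#333;margin:0;}
?>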

Eleventh, avoid redirects (Avoid Redirects)

Recently I read the article "Internet Explorer and Connection Limits" on IEBlog. For example, when you enter http://www.webjx.com the server automatically issues a 301 redirect to http://www.webjx.com/ (note the trailing slash), which you can watch happen in the browser's address bar. That redirect naturally costs time too. Of course this is only one example; redirects happen for many reasons, but the constant is that every extra redirect adds another web request, so keep them to a minimum.
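
For reference, this is all it takes to emit (and thus pay for) such a redirect in PHP; when one is unavoidable, a permanent 301 at least lets the browser cache it:

<?php
// A redirect costs the client a full round trip before the real page loads.
// When unavoidable, 301 (permanent) lets browsers cache the new location.
header('HTTP/1.1 301 Moved Permanently');
header('Location: http://www.webjx.com/'); // target from the example above
exit;
?>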

Twelfth, remove duplicate scripts (Remove Duplicate Scripts)

You know this one without my saying it, and not only for performance: from the standpoint of coding standards it's also right. Admittedly, though, we sometimes add code that may duplicate what's already there because we're rushing. Perhaps a unified CSS framework and JS framework would solve our problem better. Yue's view is quite right: not only should code not be duplicated, it should also be reusable.

Thirteenth, configure entity tags (Configure ETags)

I don't really understand this one, hehe. On InfoQ I found a fairly detailed explanation, "Using ETags to Reduce Web Application Bandwidth and Load"; interested students can go and read it.
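
The gist, as far as I can tell, is that the ETag is a validator: the server sends it with the response, and on revalidation it can answer 304 Not Modified instead of resending the body. A minimal sketch in PHP (the file path is invented):

<?php
// Minimal ETag handling for a static file (the path is hypothetical).
$file = 'logo.png';
$etag = '"' . md5_file($file) . '"';

header('ETag: ' . $etag);

// If the browser's cached copy still matches, answer 304 and skip the body.
if (isset($_SERVER['HTTP_IF_NONE_MATCH']) &&
    trim($_SERVER['HTTP_IF_NONE_MATCH']) === $etag) {
    header('HTTP/1.1 304 Not Modified');
    exit;
}

header('Content-Type: image/png');
readfile($file);
?>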

Fourteenth, make Ajax cacheable (Make Ajax Cacheable)

Can Ajax be cached too? When making Ajax requests, we often deliberately add a timestamp to avoid caching. But it's important to remember that "asynchronous" does not imply "instantaneous." Remember: even though Ajax responses are generated dynamically, and may apply to only one user, they can still be cached.
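
So an Ajax endpoint can send the same caching headers as any other response. A minimal sketch (the endpoint and the five-minute lifetime are my own invention):

<?php
// Hypothetical Ajax endpoint: let the browser cache this JSON for five
// minutes, so repeated requests for the same URL hit the cache instead.
header('Content-Type: application/json');
header('Cache-Control: max-age=300');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 300) . ' GMT');

echo json_encode(array('status' => 'ok', 'generatedAt' => time()));
?>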


