Web Front-end Optimization-new series (1)


Source: http://www.cnblogs.com/_popc

A few words up front:

Many blogs already cover web front-end optimization in detail and with accuracy. This post is a summary of the author's past year of work: some of the knowledge is collected from other sources, but all of it has been reviewed.

Introduction:

1. Slow pages can cause a website to lose users.
2. A 500 ms slowdown means 20% of users will give up accessing the site (Google).
3. A 100 ms slowdown means 1% of users will abandon their transactions (Amazon).

These figures show how important web front-end optimization is. As programmers, we always have a strong passion to make what we build more perfect. :)

1. Reduce HTTP requests

In terms of variability, data falls into two kinds: data that changes and data that does not. Common sense says the unchanging data can be cached while the changing data cannot. So to reduce the number of HTTP requests, split the data into these two parts; the unchanging part never needs to be requested again, which cuts the request count. The sections below describe ways to do this.

1. Merge files

This includes scripts, style files, and images. You can combine several JS and CSS files into one, and merge small images with the CSS sprites technique. Why does this help? Anyone who has done web development knows that JS and CSS files are basically unchanging static files, and images are similar. If these unchanging files are properly merged, multiple requests become one, reducing the number of HTTP requests. Merging JavaScript is especially important, because browsers load JS in a blocking way: most browsers load CSS and images in parallel, but when the browser encounters a script node while building the DOM tree, it stops loading what follows until the script is parsed and executed, since JavaScript may create DOM nodes that change the document.

2. Specify Expires or Cache-Control

For static content: set the Expires header to a far-future date ("Never expire").

For dynamic pages, add a Cache-Control header in code to indicate the expiration time, for example:

response.setHeader("Cache-Control", "max-age=3600");
If the Expires header is used, the file name must change whenever the content changes; usually a version number is appended to the file name.
This point is ignored by most people. Many small systems and demos are released with JavaScript and CSS that are neither merged nor given an expiration time, so every page refresh re-downloads the whole pile of JS and CSS files: many HTTP requests and wasted traffic.
The same happens in enterprise application systems. For example, when ExtJS is used as the front-end technology, downloading its large JS file every time a page opens is wasteful. You may ask: what about static files that are not served by Apache, lighttpd, and the like, and are downloaded without Expires or max-age? The simplest fix is to write a filter and decide inside it: if the URL matches certain conditions (say, a regular expression from a configuration file), set a max-age on the response. A few lines of code are enough.
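The filter's core decision can be sketched as a small function (the regular expression and the max-age value are assumptions, not values from any real system): given a request URL, it returns the Cache-Control header to set, or null for dynamic content.

```javascript
// Hypothetical helper mirroring the filter idea: static URLs matched by a
// configurable regex get a long max-age; everything else is left alone.
var STATIC_RE = /\.(js|css|png|gif|jpg)(\?.*)?$/i; // pattern is an assumption

function cacheControlFor(url) {
  return STATIC_RE.test(url) ? 'max-age=31536000' : null;
}

console.log(cacheControlFor('/ext/ext-all.js'));  // max-age=31536000
console.log(cacheControlFor('/account/list.do')); // null
```

The same predicate works whether the real filter lives in a servlet, an Nginx config, or Node middleware; only the header-setting call differs.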

3. Cache Ajax requests

The caching method is the same as for dynamic pages, and Ajax requests should use the GET method. The URL length is limited to about 2 KB in IE. (A POST request has two steps: 1. send the request headers, 2. send the request body; according to the HTTP specification, a GET request may be sent in a single TCP packet.) This advice comes from Yahoo. True or not, there is another reason to prefer GET, drawn from experience: in one project the Ajax requests used POST, and errors appeared frequently, with Squid error pages thrown out, because the site sat behind a Squid proxy. As the HTTP protocol defines it, POST means submitting data to the server, so Squid does not cache POST requests (and really should not, because caching them would violate HTTP semantics). After the Ajax requests were changed to GET, everything returned to normal.
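One practical detail when switching Ajax requests to GET is keeping the URL stable, since caches key on the full URL. A sketch (the function name and endpoint are made up) that builds the query string with deterministically sorted parameters:

```javascript
// Sketch: build a GET URL with deterministically ordered parameters so a
// cache such as squid sees the same URL for the same logical request.
function cacheableUrl(base, params) {
  var qs = Object.keys(params).sort().map(function (k) {
    return encodeURIComponent(k) + '=' + encodeURIComponent(params[k]);
  }).join('&');
  return qs ? base + '?' + qs : base;
}

console.log(cacheableUrl('/forum/posts', { page: 2, board: 'js' }));
// /forum/posts?board=js&page=2
```

Without a fixed parameter order, two callers asking for the same data can produce two different URLs and defeat the cache.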

4. Remove duplicate JS

Importing the same script twice may also cause IE to reload the script.

5. Avoid redirection

One redirect that web developers often overlook, and that wastes response time, happens when a URL should end with a slash (/) but the slash is omitted: the server returns a 301 status code, and the browser then has to issue a new request. A normal redirect works because the server sets HTTP status 302 in the response; when the browser sees the 302, it sends a new request to the address given in the previous response. Redirection is a commonly used technique in enterprise applications, but be careful with it on public website projects: if your website does not need a redirect, do not use one. In an enterprise application project it matters much less, so you can rest easy there.
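One cheap way to avoid that particular 301 is to normalize directory-style links before emitting them. A sketch (the helper name is hypothetical):

```javascript
// Sketch: ensure directory-style URLs carry their trailing slash so the
// server never has to answer with a 301 redirect first.
function withTrailingSlash(url) {
  return url.charAt(url.length - 1) === '/' ? url : url + '/';
}

console.log(withTrailingSlash('http://example.com/blog'));  // http://example.com/blog/
console.log(withTrailingSlash('http://example.com/blog/')); // http://example.com/blog/
```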
To reduce the number of HTTP requests, check each of the points above against your own project and decide where it can be optimized.
Use a CDN

Bring the content closer to the user. The principle is simple: based on the IP address of the machine the user's browser runs on, determine which servers are closest to the user, and have the browser request those nearest machines. CDN providers usually run their own DNS servers to achieve this, and companies with strong technical teams or special scenarios sometimes build their own CDN. Either way, using a CDN makes pages respond faster (for audio, video, images, text files, and so on).


2. Reduce the volume of returned data
1. Use gzip to compress the returned data

Gzip-compressing every file type that can benefit is a simple way to reduce file size and improve the user experience. For example, a 100 KB file may shrink to about 50 KB after compression, immediately reducing network traffic. The cost is that the server spends CPU compressing and the browser spends CPU decompressing; on a modern PC the browser's decompression cost is not worth mentioning, so go ahead and compress. Be careful, though: some browsers have bugs in specific scenarios that can break the page. For example, IE6 can be troublesome with gzip in cross-domain situations; simply disable gzip for that part of the data.

2. Minify JS and CSS files

You can use JSMin or YUI Compressor to minify JS files; the latter can also minify CSS files. Not much to say here, just do it. I have tried YUI Compressor and it works well.


3. Separate css and js into external files

This is really another instance of separating unchanging data from changing data. Many people write a lot of JS and CSS directly on the page even though that data never changes; by moving it into independent external files, it can be cached by the browser. This appears to increase the number of requests, but because the data is cached after the first request, subsequent requests never need to go to the backend, reducing network bandwidth overhead.

3. Optimize Cookies

1. Reduce cookie size

Cookies are used for identity authentication and especially for personalization. They are exchanged in the HTTP request headers, so the larger they are, the slower the response.
A 3000-byte cookie increases response time by about 80 ms on DSL bandwidth. Using short names and minimizing the cookie size can help improve response time.

2. Set cookie domains properly

Because a cookie set on a domain is also sent to its subdomains, if the subdomains do not need to share the cookie, setting a specific domain on the cookie avoids unnecessary bandwidth waste and improves response speed.
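A sketch of what the header looks like when the domain is scoped explicitly (the helper name and all values are made up for illustration):

```javascript
// Sketch: compose a Set-Cookie header value scoped to one host so sibling
// subdomains (and any separate image domain) never receive it.
function setCookieHeader(name, value, domain, maxAgeSeconds) {
  return name + '=' + encodeURIComponent(value) +
         '; Domain=' + domain +
         '; Max-Age=' + maxAgeSeconds +
         '; Path=/';
}

console.log(setCookieHeader('session', 'abc 123', 'www.example.com', 3600));
// session=abc%20123; Domain=www.example.com; Max-Age=3600; Path=/
```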

3. Set a reasonable cookie expiration time

Do not make every request carry data it no longer needs.

4. Use domain separation

Serve images and other static resources from a subdomain, or from a separate, newly registered domain, to avoid unnecessary cookie transmission; for image-heavy websites this is essential. The images on javaeye do not use domain separation, so every request to the forum's image server carries cookies it does not need, which is a waste of resources.
Summary: for a single request the cookie issue looks trivial, just a few dozen bytes, but small costs accumulate the way a rope saw eventually cuts through wood. So do as the saying goes: never skip a good deed because it is small, and never commit a bad one because it is small.

4. Optimize browser loading

1. Place CSS at the top of the page

The problem with placing a style sheet at the bottom of the document is that in many browsers, including Internet Explorer, it interrupts the orderly rendering of content: the browser suspends rendering to avoid repainting page elements if styles change, and the user has to face a blank page. The HTML specification clearly states that style sheets should be placed in the HEAD of the document.

2. Load JS files at the bottom of the page

The problem scripts cause is that they block parallel downloading. Browsers are advised to download at most two resources per host name in parallel, so if your images are spread across multiple host names, more than two downloads can happen at once. While a script is downloading, however, the browser will not download other files at the same time, even from different host names.
Loading JS at the bottom does not affect how the browser displays the page, unless something calls a JS function before the script has finished loading; for example, the page is only half rendered, but part of it calls into a JS file that has not yet been downloaded. Problems can occur in that case, so load those particular JS files earlier.

3. Cache JavaScript and CSS files in the browser

If possible, let the browser cache these CSS and JS files locally, since they can be treated as static files.

 

5. Optimize JS writing

1. String concatenation (using join & push)

When JS needs to output an HTML fragment, the following kind of code is commonly used:

Var _ html = ""; _ html = "<div style = 'padding-bottom: 5px '> "; _ html + = "<span class = 'icon _ favorite 'style = 'padding-top: 2px; padding-bottom: 2px '> This is printed by js </span> "; document. write (_ html );

Let's take a look at how to write with join:

var _html = [];
var i = 0;
_html[i++] = "<div style='padding-bottom: 5px'>";
_html[i++] = "<span class='icon_favorite' style='padding-top: 2px; padding-bottom: 2px'>This is printed by js</span>";
document.write(_html.join(""));

Or use push:

var _html = [];
_html.push("<div style='padding-bottom: 5px'>");
_html.push("<span class='icon_favorite' style='padding-top: 2px; padding-bottom: 2px'>This is printed by js</span>");
document.write(_html.join(""));

As mentioned in the web performance guide, this method is more efficient.

 

2. for Loop

Inefficient:

function GetDivNum() {
    var divs = document.getElementsByTagName("div");
    var start = new Date().getTime();
    for (var i = 0; i < divs.length; i++) {
        // "low efficiency"
    }
    var end = new Date().getTime();
    alert("time used: " + (end - start) + " milliseconds");
}

Efficient:

function GetDivLen() {
    var divs = document.getElementsByTagName("div");
    var start = new Date().getTime();
    for (var i = 0, len = divs.length; i < len; i++) {
        // "high efficiency"
    }
    var end = new Date().getTime();
    alert("time used: " + (end - start) + " milliseconds");
}

Cause:

The reason lies in how the for loop executes. In the first case, divs.length is evaluated on every iteration (and getElementsByTagName returns a live collection, so the length must be recomputed), while in the second case the length is computed once at the start and saved in a variable, so the loop runs faster. So when using a for loop, especially when the length is expensive to compute, save it in a variable first. The difference is not always this significant: if you are merely iterating over a plain array, reading its length is cheap and the two forms perform about the same; I will not test that here.

OK, that summarizes some front-end optimizations and their server-side counterparts. Next we will introduce how to merge JS files and compress the output.
