Site optimization, biased toward the front end

Source: Internet
Author: User
Tags: HttpContext, script tag, browser cache

[Figure: the process of a browser request and response]

I believe this figure should give you some feel for the topic. "Site optimization" as a phrase is too broad to act on directly; in practice, you just solve problems step by step as you run into them.

Still, knowing the general ideas broadens your knowledge. I may only be Bronze tier, but I have the awareness of a King.

Yahoo's Exceptional Performance team has laid down 23 rules:

  1. Reduce the number of HTTP requests

Merge images, CSS, and JS to reduce the wait time for first-time visitors.
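A minimal build-time sketch of the merging idea: concatenating several text assets into one bundle turns N HTTP requests into 1. The file names here are hypothetical, and a real build tool would also minify.

```python
from pathlib import Path

def bundle(files, out_path):
    """Concatenate text assets (CSS or JS) into a single bundle file,
    so the page needs one request instead of one per file."""
    parts = []
    for f in files:
        # keep a marker comment so the bundle stays debuggable
        parts.append(f"/* --- {f} --- */\n" + Path(f).read_text())
    Path(out_path).write_text("\n".join(parts))
    return out_path
```

The same principle drives CSS sprites for images: many small resources are combined into one, trading a little file size for far fewer round trips.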
2. Use a CDN
Nearby cache ==> intelligent routing ==> load balancing ==> WSA whole-site dynamic acceleration
3. Avoid empty src and href
When the href attribute of a link tag or the src attribute of a script tag is empty, the browser substitutes the URL of the current page as the attribute value, and so loads the page's own content again as if it were the resource.
4. Specify an Expires header
Make content cacheable; this avoids unnecessary HTTP requests on subsequent page visits.
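A sketch of what such headers look like, built with the Python standard library. A far-future Expires (or the equivalent Cache-Control: max-age) lets the browser reuse the file from its cache without any HTTP request on repeat visits; the one-year default here is a common convention, not something prescribed by the article.

```python
import time
from email.utils import formatdate

def caching_headers(max_age_seconds=31536000):  # one year
    """Build far-future caching headers for a static asset."""
    return {
        # absolute expiry date, in the HTTP date format (RFC-style, GMT)
        "Expires": formatdate(time.time() + max_age_seconds, usegmt=True),
        # the modern equivalent: relative lifetime in seconds
        "Cache-Control": f"public, max-age={max_age_seconds}",
    }
```

The catch: a cached file cannot be updated in place, so sites that use far-future expiry usually version the file name (e.g. app.v2.js) to force a fresh download when content changes.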
5. Compress content with gzip
Compressing any response of a text type, including XML and JSON, is worthwhile.
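A quick demonstration of why it is worthwhile, using Python's gzip module on a repetitive JSON payload of the kind APIs typically return:

```python
import gzip
import json

# a typical repetitive JSON response body
payload = json.dumps([{"id": i, "status": "ok"} for i in range(200)]).encode()

compressed = gzip.compress(payload)

# For text this repetitive, the compressed body is a small fraction of the
# original; the server spends a little CPU to save a lot of bandwidth.
ratio = len(compressed) / len(payload)
```

Binary formats such as JPEG or PNG are already compressed, so gzipping them wastes CPU for no gain; that is why the rule is scoped to text types.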
6. Put CSS at the top
7. Put JS at the bottom
This prevents JS loading from blocking the download of subsequent resources.
8. Avoid using CSS expressions
9. Place CSS and JS in an external file
The goal is caching, but sometimes, to reduce requests, CSS and JS are written directly into the page; the trade-off depends on the ratio of page views (PV) to unique visitors (IP).
10. Weigh the number of DNS lookups
Reducing the number of host names saves response time. Note, however, that it also reduces the number of concurrent downloads on the page.
IE can only download two files from the same domain at a time, so when a page displays many images, the download speed for IE users suffers. That is why Sina sets up n second-level domain names just for hosting images.
11. Minify CSS and JS
12. Avoid redirects
Same domain: watch out for redirects caused by a missing trailing slash "/";
Cross-domain: use Alias or mod_rewrite, or establish a CNAME (a DNS record that maps one domain name to another).
13. Delete duplicate JS and CSS
Repeatedly including the same script not only adds extra HTTP requests but also wastes time on repeated evaluation. In both IE and Firefox, duplicate JavaScript causes problems regardless of whether the script is cacheable.
14. Configure Etags
It is used to determine whether an element in the browser cache matches the one on the origin server. It is more precise than the Last-Modified date: for a file modified 10 times within 1 second, the ETag can combine the file's inode number, mtime (modification time), and size to make an accurate judgment, avoiding the problem that Unix mtime is only accurate to the second. When a server cluster is used, it is best to keep only the latter two components. ETags can reduce web application bandwidth and load.
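A sketch of how such an ETag can be derived from file metadata. This is an illustration, not any particular server's algorithm: it combines inode, nanosecond mtime, and size, and offers a cluster-safe variant that drops the inode, since inode numbers differ across machines and would make identical files look different behind a load balancer.

```python
import os

def etag(path, include_inode=True):
    """Derive a weak validator from file metadata.

    include_inode=False is the cluster-safe variant: on a server farm the
    same file has different inode numbers on each machine, which would
    defeat caching if the inode were part of the tag."""
    st = os.stat(path)
    parts = [st.st_mtime_ns, st.st_size]  # ns mtime avoids 1s granularity
    if include_inode:
        parts.insert(0, st.st_ino)
    return '"' + "-".join(format(p, "x") for p in parts) + '"'
```

The server compares the client's If-None-Match header against the current tag and answers 304 Not Modified on a match, skipping the response body entirely.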
15. cacheable Ajax
"Async" does not mean "instant": Ajax does not guarantee that users will not spend time waiting for asynchronous JavaScript and XML responses.
16. Use get to complete AJAX requests
When using XMLHttpRequest, a POST in the browser is a two-step process: the headers are sent first, then the data. Therefore, it often makes more sense to fetch data with GET.
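A small sketch of the GET side of this rule: the parameters travel in the URL, which also makes the response cacheable by URL. The endpoint name below is hypothetical; sorting the parameters keeps the URL stable so equivalent requests hit the same cache entry.

```python
from urllib.parse import urlencode

def ajax_get_url(base, params):
    """Build a cacheable GET URL: parameters live in the query string,
    sorted so the same parameters always yield the same URL."""
    return base + "?" + urlencode(sorted(params.items()))

# hypothetical endpoint, for illustration only
url = ajax_get_url("https://example.com/api/search", {"q": "etag", "page": 1})
```

GET should still be reserved for reads; requests that change server state belong in POST regardless of the performance difference.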
17. Reduce the number of DOM elements
Is there a more appropriate tag that could be used? There is more to life than div + CSS.
18. Avoid 404
Some sites change the 404 error page to "Are you looking for ***?", which improves the user experience but also wastes server resources (database queries, etc.). The worst case is when a link to an external JavaScript file is broken and returns a 404: first, the load breaks parallel downloading; second, the browser will try to execute whatever it can salvage from the returned 404 response body as JavaScript.
19. Reduce the size of cookies
20. Using a domain without cookies
Yahoo!'s static files, such as images and CSS, live on yimg.com, so when the client requests a static file, the cookies set for the primary domain (yahoo.com) are not transmitted again and again.
21. Do not use filters
For PNG24 translucency in IE6, don't mess around with filters; calmly cut the image into PNG8 + JPG instead.
22. Do not scale the picture in HTML
23. Reduce the size of favicon.ico and cache it

There are also points like avoiding redirects and making fewer DNS queries. A lot of this needs to be thought through calmly.

Take the back end as an example:

A website collapsed at around 400 concurrent visits.

Analysis:

  1. Too many files are uploaded (or too many pictures).

2. The pages cannot withstand the load.

3. The pressure on the database is too high.

Solution to the first problem: too many file uploads. This is both the hardest problem to solve and the simplest, because the solution is one word: money. Just look at how sites like Youku and Tudou burn money! It comes down to concurrency. Suppose the pipe into the site is 100M; then (naively) we compute the parallel capacity against that 100M. If every user uploads a 5M file or image, the site's concurrency is 100/5 = 20. That is, only 20 people can use the site at once; beyond that, the mild outcome is lost files, and the general rule is that the site crashes. This problem is the hardest to solve because files and images are always the biggest killers of site traffic, and the only real remedy is to split off a separate image server and a separate file server. (But this violates the assumption of a single server. Some companies look big, yet if the boss does not value the IT department and will not invest, there is nothing to be done.)

Solution to the second problem: the pages cannot withstand the load. Here I do have something to say, as I have seen many pages written by programmers that are perpetually stuck processing. Although .NET has always been positioned for small and medium-sized websites, I think that is exactly why a small or medium-sized site can be developed in an agile way: write it quickly and correctly, without bugs. Let us analyze the causes concretely:

When an HTTP request arrives from the client, the web server queues and parses it. If the request is only for a static file, such as a CSS, JS, or HTML file, or a file under a virtual directory, IIS fetches the corresponding file directly and returns it to the client as the HTTP response; if that were all there was to it, many of us would be unemployed, hehe. For dynamically executed files that need further processing, however, IIS must pass the request on to the corresponding handler; when the handler finishes executing, the final HTTP response is returned to the client through IIS. If a request mixes static and dynamic content, the static content is combined with the generated HTML once the dynamic content has been produced, and the result is returned. For IIS, these handlers are represented by ISAPI extensions. IIS looks up the extension of the requested page in a table called the ISAPI Extension Mapping, maintained in the IIS metabase, which maps each resource type to its corresponding ISAPI extension. The mapping for .aspx is the ASP.NET ISAPI extension, which at this point creates an aspnet_wp.exe worker process (if the process does not already exist).

When the first .aspx request is received, a class named ApplicationManager creates an application domain (AppDomain). The application domain isolates global variables and allows each application to be independent of the others. Inside the application domain, an instance of the HostingEnvironment class is created, which provides access to information about the application, such as the name of the folder where it is stored. If required, ASP.NET also compiles the top-level items in the application, including the application code in the App_Code folder. After the application domain has been created and the HostingEnvironment object instantiated, ASP.NET creates and initializes the core objects, such as HttpContext, HttpRequest, and HttpResponse. The HttpContext class holds the objects specific to the current request, such as the HttpRequest and HttpResponse objects. The HttpRequest object contains information about the current request, including cookies and browser information. The HttpResponse object contains the response to be sent to the client, including all rendered output and cookies.

From the above analysis we can summarize the mechanism and causes of IIS reading pages:

The first step is to analyze incoming requests and classify them into static page requests and dynamic page requests. Static requests are for static HTML pages; dynamic requests we can, for now, understand as .aspx or .cshtml requests.

The second step is to process the dynamic page request, and once the dynamic request has been turned into static content, return it to the browser.
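The two-step dispatch summarized above can be sketched very roughly. This is not real IIS code, just an illustration of the classification: static extensions are served as-is, while dynamic extensions go through a handler that renders HTML before the response is returned.

```python
# Extension sets are illustrative, mirroring the examples in the text.
STATIC_EXTS = {".html", ".css", ".js"}
DYNAMIC_EXTS = {".aspx", ".cshtml"}

def dispatch(path, read_file, run_handler):
    """Route a request: static files are returned directly; dynamic pages
    are executed by a handler whose HTML output is returned instead."""
    ext = path[path.rfind("."):]
    if ext in STATIC_EXTS:
        return read_file(path)      # served as-is, no handler involved
    if ext in DYNAMIC_EXTS:
        return run_handler(path)    # executed, then its HTML is returned
    raise ValueError("no handler mapping for " + ext)
```

In real IIS the role of `run_handler` is played by the ISAPI extension (ASP.NET ISAPI for .aspx), and the extension-to-handler table is the ISAPI Extension Mapping.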

So I came up with two conclusions:

First, pages with high traffic whose data rarely changes can be made static. This is also what some popular websites do today.

Second, we can do our best to reduce the time spent processing dynamic requests.
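The first conclusion can be sketched as a simple output cache: render the hot, rarely-changing page once, then serve the stored HTML so repeat requests skip the dynamic pipeline entirely. `render` below stands in for whatever expensive dynamic work the page normally does; a real implementation would also need invalidation when the underlying data changes.

```python
# In-memory cache of rendered pages, keyed by request path.
_page_cache = {}

def serve(path, render):
    """Serve a page, doing the expensive dynamic render only once."""
    if path not in _page_cache:
        _page_cache[path] = render(path)   # dynamic work happens here, once
    return _page_cache[path]               # later hits are effectively static
```

The same idea scaled up is pre-generating .html files at publish time, which lets the web server treat the page as a plain static request.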

Solutions for the third problem: database pressure. Much of this comes down to the programmer's own skill, or an architecture that was not designed well.

I guess the reason may be:

First, some people like to store files or images as binary data in the database; for the consequences, see the first cause of the crash.

Second, some programmers are very good at database technology, so they encapsulate all business logic in stored procedures in the database, leaving the back-end code with only a transaction rollback, or not even that. With such a design, the back end will naturally error out while waiting for the response.

  

