Performance Optimization Methods for Web Sites

Source: Internet
Author: User
Tags: APC, PHP, development environment, subdomain, website performance


I. Frontend Optimization

Website performance optimization is a comprehensive topic that involves server configuration, front-end and back-end program design, and more. Here I simply share my own experience with optimizing website performance. The reason I put "web" in the title is that this article focuses on performance optimization for small and medium websites; the system I use is also a typical LAMP web architecture.

 

First, let's talk about front-end optimization. About 80% of the time users spend waiting for a Web page is spent on the browser front end, especially on downloading the various elements in the page (images, CSS, JavaScript, Flash, and so on). Therefore, in many cases, front-end optimization yields a multiplier effect compared with spending a lot of time on hard, complex program improvements. Yahoo has released its internal performance testing tool YSlow to third parties, along with its famous 13 website performance optimization rules. We recommend that you download and install YSlow and use it as a tool to evaluate the effectiveness of your website optimization. The most valuable optimization methods are described below.

For users who are visiting your website for the first time and have not yet cached its content in their browser, we can do the following:

1) Reduce the number of HTTP connections per page
For first-time visitors, the number of HTTP connections generated by a page is a key bottleneck affecting performance.

Countermeasures:
- Keep the page design as concise as possible: minimize the number of images, and give up unnecessary page effects to reduce the use of JavaScript.
- Use optimization techniques such as CSS sprites (combining several small images into one and showing each by shifting the background position), image maps, and inline images (embedding small images directly in the page as data URIs) to reduce the number of image requests; a sprite sketch follows this list.
- Merge JS and CSS files as much as possible to reduce the number of independent files.
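As a rough illustration of the sprite technique, assume a single combined image sprite.png (a made-up file) that stacks several 16x16 icons vertically; the CSS below then shows each icon by shifting the background position, so one image request serves all of the icons:

    /* sprite.png is a hypothetical image containing several 16x16 icons stacked vertically */
    .icon        { background: url(/img/sprite.png) no-repeat; width: 16px; height: 16px; }
    .icon-home   { background-position: 0 0; }      /* first icon in the sprite */
    .icon-search { background-position: 0 -16px; }  /* second icon, 16px down in the image */
    .icon-mail   { background-position: 0 -32px; }  /* third icon */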

2) Use gzip to compress webpage content
Using gzip to compress static content on a webpage can significantly reduce the time users wait for it (reportedly by up to 60%). Mainstream Web servers support or provide gzip compression: if you use Apache, you only need to enable mod_gzip (Apache 1.x) or mod_deflate (Apache 2.x) in the configuration file. For static pages, gzip compression significantly improves server efficiency and reduces bandwidth costs. Note that image content is already in a compressed format and should not be gzip-compressed again.
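A minimal sketch for Apache 2.x, assuming mod_deflate is available; only text-based types are compressed, and images are deliberately left out because they are already compressed:

    # httpd.conf -- enable gzip compression for text content (Apache 2.x, mod_deflate)
    LoadModule deflate_module modules/mod_deflate.so
    AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript
    # JPEG/PNG/GIF images are not listed: they are already compressed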

3) Place CSS at the top of the page and JS files at the bottom
CSS references should be placed in the HTML head, and JS file references should be placed as close to the bottom of the page as possible, ideally just before the closing body tag. The main idea is to let the core page content display as soon as possible. Note, however, that for pages which rely heavily on JS, moving all scripts to the bottom may cause unpredictable problems, so apply this according to the actual situation.
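The layout can be sketched with a minimal page skeleton (file names are placeholders): the stylesheet goes in the head so the page can render progressively, and the script goes just before the closing body tag so it does not block rendering:

    <html>
    <head>
      <link rel="stylesheet" type="text/css" href="/css/site.css">  <!-- CSS first -->
    </head>
    <body>
      <!-- page content is displayed before the script is fetched -->
      ...
      <script type="text/javascript" src="/js/site.js"></script>    <!-- JS last -->
    </body>
    </html>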

4) Minify JS file content
Specifically, use a JavaScript compression tool to compress your scripts: remove whitespace and comments and shorten variable names. On top of gzip compression, minifying JS content can improve performance by a further 5% or so.

5) Minimize the use of external scripts and reduce DNS lookup time
Do not reference too many external scripts on a webpage. First, each DNS resolution typically takes 20 to 120 milliseconds. Second, if a page references too many external files (such as various advertisement and affiliate code), your website may be slowed down by the response speed of those external servers. If you must use them, try to put these scripts in the page footer. It is also worth mentioning that browsers process only two requests in parallel for the same domain name, but different subdomains are not subject to this limit. Therefore, placing this site's static content (CSS, JS) under a separate subdomain (such as static.xxx.com) helps improve the browser's ability to download page content in parallel.

For users who visit your website frequently, the main optimization idea is to make full use of the browser cache and minimize server overhead.

1) Add an expiration time (Expires header) to the response header
Add a long expiration time to static content in the header, so that when users access that static content again, it is read straight from the cache without any interaction with the server. This does create a problem: when images, CSS, or JS files are updated, users who do not force-refresh their browsers will not see the update. To guarantee that users get the latest content, the updated images, CSS, and JS files must be given new names, which can cause a lot of trouble for development because these files may be referenced from many places across the site. The solution proposed by Flickr is to use URL rewriting to point URLs of different versions to the same physical file. This is a clever approach: URL-level operations are very efficient and it is convenient for the development process. A sketch follows.
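A rough Apache sketch of both ideas, assuming mod_expires and mod_rewrite are enabled; the ".v2" version tag in file names is a made-up convention, and the rewrite simply strips it so every versioned URL maps to the same physical file:

    # Far-future Expires headers for static content (mod_expires)
    ExpiresActive On
    ExpiresByType image/png  "access plus 1 year"
    ExpiresByType text/css   "access plus 1 year"
    ExpiresByType application/javascript "access plus 1 year"

    # Flickr-style versioned URLs: /css/site.v2.css is served from /css/site.css (mod_rewrite)
    RewriteEngine On
    RewriteRule ^(.+)\.v[0-9]+\.(css|js|png|gif|jpg)$ $1.$2 [L]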

To understand why this works, you need to understand how the browser behaves when accessing a URL:
A. On the first access, the browser obtains the page content from the server and puts the related files (images, CSS, JS, and so on) into its cache; the Expires time, Last-Modified, ETag, and other header information for each file are kept along with it.
B. On a repeat access, the browser first checks whether the file already exists in the cache. If it does, the browser checks its expiration time; if the file has not expired, it is read directly from the cache and the server is not contacted at all.
C. If the cached file has no expiration time or has expired, the browser asks the server for the file's header information and checks the Last-Modified and ETag values. If the file has not been modified since the last access, the local cached copy is used; if it has been modified, the latest version is fetched from the server.
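A simplified header exchange illustrates the cases above (all values are invented for the example):

    # First access: the server returns the file together with its caching metadata
    GET /css/site.css HTTP/1.1

    HTTP/1.1 200 OK
    Expires: Thu, 15 Apr 2010 20:00:00 GMT
    Last-Modified: Mon, 01 Jan 2007 10:00:00 GMT
    ETag: "3e86-410-3596fbbc"

    # Access after the Expires time: the browser revalidates instead of downloading again
    GET /css/site.css HTTP/1.1
    If-Modified-Since: Mon, 01 Jan 2007 10:00:00 GMT
    If-None-Match: "3e86-410-3596fbbc"

    HTTP/1.1 304 Not Modified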

In my experience, following this principle and adding an expiration time to static files wherever possible can greatly reduce repeated requests to the server.

2) Place CSS and JS in independent external files
Placing CSS and JS in independent files lets them be cached on their own and read directly from the browser cache when other pages reference them. The homepage of some websites may be an exception: it may not be visited repeatedly, yet it is the user's first impression of the site and the starting point for reaching other pages, and it may also rely heavily on Ajax partial refreshes. For such pages you can write the CSS and JS directly into the page.

3) Remove duplicate scripts
In IE, referencing the same JS script more than once makes the browser cache unusable for it. Check your program carefully and remove scripts that are referenced repeatedly.

4) Avoid redirects
Apart from redirects deliberately sent in the header, webpage redirects often happen inadvertently, and redirected content is not cached by the browser. For example, when a user accesses www.xxx.com, the server may issue a 301 redirect to www.xxx.com/ just to add the trailing slash; with a poorly configured server this also places an extra burden on it. You can avoid unnecessary redirects by configuring an Apache Alias or by using the mod_rewrite module.
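A minimal Apache sketch, using the document's placeholder domain www.xxx.com: Alias can serve an old path directly instead of redirecting it, and when a redirect is unavoidable (such as canonicalizing the host name), mod_rewrite can do it with a single 301:

    # Serve an old path directly with Alias instead of redirecting visitors to a new location
    Alias /old-docs /var/www/html/docs

    # If a redirect is unavoidable, canonicalize the host name with one explicit 301 (mod_rewrite)
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^xxx\.com$ [NC]
    RewriteRule ^/?(.*)$ http://www.xxx.com/$1 [R=301,L]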

There are other rules as well, such as using a CDN for content distribution, avoiding CSS expressions, and avoiding ETags. Since they apply less commonly, I will not go into detail here.

After completing the optimizations above, you can use YSlow to test the page's performance score; generally it can reach 70 or higher.

Of course, besides browser front-end and static content optimization, there is also optimization of program scripts, servers, databases, and load balancing. These deeper optimization methods require more technical expertise, and the second half of this article focuses on them.

II. Backend Optimization

After finishing the front-end optimization part, I had always wanted to write up the back-end optimization methods; today I finally have time to sort out my thoughts.

Front-end optimization avoids unnecessary waste of server and bandwidth resources, but as website traffic grows, front-end optimization alone cannot solve every problem. The back-end software's ability to handle parallel requests, program execution efficiency, hardware performance, and system scalability become the key bottlenecks affecting website performance and stability. To optimize system and program performance, you can start from the following aspects:

1) Configuration optimization of software such as Apache and MySQL
Although the default settings that Apache and MySQL ship with are enough to get your website running, adjusting some of their system parameters lets you pursue higher efficiency and stability. There are many professional articles and forums on this subject (for example http://www.mysqlperformanceblog.com/); mastering it requires in-depth study and practice, so it is not discussed at length here. A few commonly tuned parameters are sketched below.
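For orientation only, here are a few of the parameters that are typically tuned; the values are arbitrary illustrations rather than recommendations, since suitable settings depend entirely on your hardware and workload:

    # httpd.conf (prefork MPM): cap processes so Apache does not exhaust memory
    StartServers        8
    MaxClients        150
    KeepAlive          On
    KeepAliveTimeout    3

    # my.cnf: give MySQL enough memory for indexes and the query cache
    key_buffer_size  = 128M
    query_cache_size = 32M
    max_connections  = 300
    table_cache      = 512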

2) Accelerate the application environment
Here I will only discuss PHP, the development environment I use most often. Several tools can speed things up by optimizing the PHP runtime environment; the basic principle is to pre-compile and cache PHP bytecode without changing any code, so it is a relatively easy way to improve PHP execution efficiency by 50% or more. Commonly used free PHP accelerators include APC (http://pecl.php.net/package-info.php?package=APC), Turck MMCache (http://turck-mmcache.sourceforge.net), and PHP Accelerator (www.php-accelerator.co.uk); the Zend Performance Suite (www.zend.com) is a paid option.
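As an example, APC is usually enabled through php.ini once the extension is installed; the values here are illustrative assumptions:

    ; php.ini -- enable the APC opcode cache
    extension = apc.so        ; php_apc.dll on Windows
    apc.enabled = 1
    apc.shm_size = 64         ; shared memory (in MB on older APC versions) for compiled opcodes
    apc.ttl = 7200            ; seconds an unused cache entry may remain before it can be evicted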

3) Separate static and dynamic content
Apache is a fully featured but relatively heavyweight Web server. Its resource usage is basically proportional to the number of processes running at the same time, it consumes a large amount of server memory, and its efficiency at handling parallel requests is only average. In some cases we can use a lightweight Web server to host static images, style sheets, and JavaScript files, which greatly improves the speed of serving static files and also reduces memory usage. The Web server I use for this is nginx, from Russia; other options include lighttpd and thttpd.
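A minimal nginx sketch for a static subdomain such as the static.xxx.com mentioned in the front-end section (paths are placeholders): nginx serves images, CSS, and JS directly from disk with long-lived caching while the dynamic site stays on Apache:

    server {
        listen       80;
        server_name  static.xxx.com;
        root         /var/www/static;

        # images, CSS, and JS are served directly with a far-future expiry
        location ~* \.(gif|jpg|jpeg|png|css|js)$ {
            expires 30d;
        }
    }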

4) Front-end load balancing based on a reverse proxy
When a single front-end server can no longer handle the user traffic, load balancing Web access through a front-end proxy server is the quickest feasible solution. Apache's mod_proxy can provide reverse-proxy-based load balancing, but nginx is recommended as the proxy server because it is faster than Apache.
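A minimal sketch of nginx acting as a reverse proxy that balances requests across two back-end Apache servers (addresses and host name are placeholders):

    # nginx.conf (inside the http block)
    upstream backend {
        server 192.168.0.11:80;
        server 192.168.0.12:80;
    }

    server {
        listen       80;
        server_name  www.xxx.com;

        location / {
            proxy_pass http://backend;              # distribute requests across the upstream servers
            proxy_set_header Host $host;            # pass the original Host header to the back end
            proxy_set_header X-Real-IP $remote_addr;
        }
    }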

5) Apply caching to improve database performance: file caches and distributed caches
The database's ability to handle concurrent access is a key bottleneck for many website applications. Before you resort to a master-slave structure and multiple database servers to build a cluster, you should first make full use of caching for database queries. Some database engines (such as MySQL's InnoDB) have built-in cache support. In addition, you can cache common query results in files or memory from the program side; for example, PHP's ob_start and file read/write functions make it easy to implement a file cache. If you have multiple servers, memcached can cache database queries in distributed shared memory; it is efficient and scalable, and has been proven in well-known applications such as LiveJournal and craigslist.org.
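A rough PHP sketch of caching a common query result in memcached via the PECL Memcache extension; the host, key name, and query are placeholders, and error handling is omitted:

    <?php
    // Cache an expensive query result in memcached for five minutes (sketch only)
    $memcache = new Memcache();
    $memcache->connect('127.0.0.1', 11211);

    $key = 'top_articles';
    $result = $memcache->get($key);

    if ($result === false) {                      // cache miss: query the database
        mysql_connect('localhost', 'user', 'pass');
        mysql_select_db('site');
        $res = mysql_query('SELECT id, title FROM articles ORDER BY views DESC LIMIT 10');
        $result = array();
        while ($row = mysql_fetch_assoc($res)) {
            $result[] = $row;
        }
        $memcache->set($key, $result, 0, 300);    // store for 300 seconds
    }
    ?>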

6) Monitor the server's running status and find the bottlenecks affecting performance
System optimization is never a one-time job. You need to monitor the server's running status to spot performance bottlenecks and potential problems in time, because a website's performance is always limited by the shortest plank in the barrel. You can write scripts to check whether your Web services are running properly, and some open-source software provides good monitoring functions as well, such as monit (http://www.tildeslash.com/monit/). A small monit sketch follows.
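As a small illustration, a monit rule can restart Apache automatically when its HTTP port stops responding; the pid file path and timings below are assumptions:

    # monitrc -- watch the Apache process and its HTTP port
    check process apache with pidfile /var/run/httpd.pid
        start program = "/etc/init.d/httpd start"
        stop  program = "/etc/init.d/httpd stop"
        if failed host 127.0.0.1 port 80 protocol http then restart
        if 5 restarts within 5 cycles then timeout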

7) A good, scalable architecture is the foundation of stability and performance
Tips and tricks can help you get over individual hurdles, but to let a website cope with large-scale access you need to plan the system architecture thoroughly. Fortunately, many predecessors have unselfishly shared their website architecture experience with us, helping us avoid many detours. Two inspiring articles I read recently:
- How LiveJournal optimized the performance of a large-scale website, from the perspective of its back-end development (http://www.example.net.cn/archives/2006/03/olivejournaloio.html)
- The six refactorings of MySpace (http://enissue.com/archive/4)

Finally, I have to mention the impact of program code and database structure on performance. A few badly written loops, an unreasonable query, or a poorly designed table or index can easily slow an application down several times over. Cultivating the ability to think about the system as a whole, developing good programming habits, and understanding how the database works are the basis for improving programming quality.

I am not an expert in website architecture. The purpose of this article is simply to summarize my practical experience in building and maintaining websites and share it with you; for experts it may well be nothing new.

 
