The Xi'an Yicong website is my company's corporate site; the company is mainly engaged in software development and website construction. Based on the site's keywords and business positioning, we chose "Xi'an website construction, Xi'an website, Xi'an software company" as the main keywords, and "Xi'an website construction price, Xi'an website construction company, Xi'an software development company" as the auxiliary keywords.
The website's program was developed by our own company, and the back end was designed with every SEO detail in mind. For example: the whole site is generated as static pages, with a distinct title and description on each page; the back end automatically generates a sitemap and submits it to each search engine; RSS feeds are generated automatically and submitted to Baidu Blog Search, Google Blog Search, and other RSS aggregators; a Google News source is generated automatically and submitted to Google; a fixed quota of original or lightly rewritten ("pseudo-original") content is published on a regular schedule (usually 8-10 articles after 9 a.m., at the start of the workday); robots.txt is generated automatically; and the site runs on a leased dedicated server to guarantee bandwidth and speed, with a hardware firewall to keep it safe from attack. In short, on the technical side we laid all the groundwork for optimization.
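As one small piece of that groundwork, an auto-generated robots.txt needs very little; a minimal illustrative example (the disallowed path and sitemap URL here are placeholders, not the actual site's):

```
User-agent: *
Disallow: /admin/
Sitemap: http://www.example.com/sitemap.xml
```

The Sitemap line lets crawlers discover the sitemap without it being submitted to each engine by hand.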
Over the next month, the site's keywords rose in Baidu's rankings by leaps and bounds; the main and auxiliary keywords were basically all on the first results page, and content submitted to the Google news source was indexed within about five minutes of being published. I was quietly pleased with these small achievements. Then, starting with Baidu's big update on August 27, my site's keyword rankings plummeted and indexing became abnormal.
Modern browsers (IE6 and later, Firefox) support client-side gzip: the server compresses a page with gzip before transmission, and the browser decompresses it for display after receiving it. This costs a small amount of CPU on both server and client, but in exchange gives much better bandwidth utilization. For plain text, the compression ratio is considerable; if each user saves 50% of the bandwidth, the same rented bandwidth can serve roughly twice as many visitors.
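The saving is easy to measure for yourself; a minimal sketch (not from the article) that gzips a repetitive HTML snippet and reports the savings, illustrating why plain text compresses so well:

```python
import gzip

# Repetitive markup, like a listing page of near-identical rows.
html = ("<div class='item'><a href='/page'>Xi'an website construction</a></div>\n"
        * 200).encode("utf-8")

compressed = gzip.compress(html)
ratio = len(compressed) / len(html)
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes ({ratio:.1%})")
```

Highly repetitive markup like this routinely shrinks by 90% or more; real pages with varied text still commonly shrink by around half, which is the 50% bandwidth saving the paragraph above refers to.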
IIS 6 has built-in gzip compression support; unfortunately, there is no proper management interface for it, so it takes some work to turn the option on.
Enable the Gzip compression feature of IIS
First, if you need to compress static files (HTML), create a temporary directory on your hard disk and give the "IUSR_<machine name>" account write permission to it. Compressing dynamic files (php, asp, aspx) does not need this, because their pages are generated anew on every request and the compressed output is discarded rather than cached. Then, in IIS Manager, right-click the Web Sites node and open Properties (not one individual site, but the entire Web Sites node), go to the Service tab, and enable compression for both application (dynamic) files and static files.
Then, under the Web Service Extensions node below the sites, create a new extension. The name doesn't matter; the file to add is:
C:\windows\system32\inetsrv\gzip.dll
Then set this extension to Allowed.
At this point static content can be compressed, but dynamic content such as .aspx files is still outside the compression scope, because the default list of compressible file extensions does not include it. There is nowhere in the admin interface to add an extension; you can only edit the configuration file directly.
Under C:\windows\system32\inetsrv\ there is a MetaBase.xml file, which you can open with Notepad. Search for IIsCompressionScheme; there are three sections of that name, for deflate, gzip, and Parameters respectively. Ignore the third; the first two carry essentially the same parameters. In both of them, add a row for aspx under the HcScriptFileExtensions parameter (if you have other dynamic pages to compress, add their extensions here as well), and change HcDynamicCompressionLevel to 9 (the range is 0-10, and 9 is the most cost-effective setting).
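For orientation, the edited gzip section ends up with roughly this shape (heavily trimmed; a real MetaBase.xml carries many more attributes, and the deflate section is edited the same way — treat this as an illustrative sketch, not a copy-paste template):

```xml
<IIsCompressionScheme	Location="/LM/W3SVC/Filters/Compression/gzip"
		HcCompressionDll="%windir%\system32\inetsrv\gzip.dll"
		HcDoDynamicCompression="TRUE"
		HcDoStaticCompression="TRUE"
		HcDynamicCompressionLevel="9"
		HcScriptFileExtensions="asp
			dll
			exe
			aspx"
		HcFileExtensions="htm
			html
			txt"
	>
</IIsCompressionScheme>
```

Note that IIS only rereads MetaBase.xml on restart (or when "enable direct metabase edit" is turned on), which is why the next step restarts the service.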
Then restart the IIS service and compression takes effect. The speedup is often not obvious to the eye; you can use the page at http://www.port80software.com/tools/compresscheck.asp to check whether your page is actually being compressed, as well as the compression ratio. Throughout the whole process, the extra CPU usage is barely noticeable.
Analyzing the site's IIS logs, I found that when the spider fetched my home page the status was "304 0 0", which means the spider did not see the page as having been updated. After some analysis this looked like an IIS caching problem; I searched everywhere for a solution and finally added Cache-Control: no-cache to the site's HTTP headers in IIS.
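The "304 0 0" entry reflects ordinary conditional-GET behavior: the spider sends an If-Modified-Since header, and the server answers 304 Not Modified instead of resending the page body. A minimal sketch of that decision (the function and names here are mine for illustration, not IIS internals):

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime
from typing import Optional

def respond(last_modified: datetime, if_modified_since: Optional[str]) -> int:
    """Status code a server returns for a conditional GET."""
    if if_modified_since is not None:
        try:
            since = parsedate_to_datetime(if_modified_since)
        except (TypeError, ValueError):
            return 200  # unparsable header: ignore it and send the full page
        if last_modified <= since:
            return 304  # Not Modified: headers only, no body
    return 200          # full response with the page body

page_time = datetime(2009, 8, 20, 9, 0, tzinfo=timezone.utc)
spider_visit = format_datetime(datetime(2009, 8, 27, tzinfo=timezone.utc))

print(respond(page_time, spider_visit))  # 304 - page unchanged since the visit
print(respond(page_time, None))          # 200 - no conditional header at all
```

Cache-Control: no-cache tells intermediaries and clients to revalidate rather than serve a stale copy, which is why adding it was a reasonable response to a suspected caching problem.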
Post-mortem log analysis showed that all the static pages were fine: the spider's return code for them was "200 0 0", meaning everything was OK. But every dynamic page (with an .asp or .php suffix) returned "200 0 64", meaning something was still wrong. Since I had only recently enabled gzip compression, I turned off compression for application files (that is, stopped compressing the dynamic pages), observed for a few days, and everything returned to normal. After repeating this experiment several times, I finally understood the cause.
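A quick way to repeat this check is to scan the IIS log for the sc-win32-status field; a sketch, assuming the default W3C extended log layout where the last three space-separated fields are sc-status, sc-substatus, and sc-win32-status (Win32 status 64 is ERROR_NETNAME_DELETED, i.e. the connection was dropped before the response completed):

```python
def suspect_requests(log_lines):
    """Yield (uri, status triple) for requests whose sc-win32-status is 64.

    Assumes the default W3C extended field order; adjust the indices to
    match the #Fields: line of your own log.
    """
    for line in log_lines:
        if line.startswith("#"):        # skip header/comment lines
            continue
        fields = line.split()
        if len(fields) < 6:
            continue
        status, substatus, win32 = fields[-3:]
        if win32 == "64":
            # cs-uri-stem is the 6th field in the default layout
            yield fields[5], f"{status} {substatus} {win32}"

log = [
    "#Fields: date time s-sitename s-ip cs-method cs-uri-stem cs-uri-query "
    "s-port cs-username c-ip cs(User-Agent) sc-status sc-substatus sc-win32-status",
    "2009-08-28 01:00:00 W3SVC1 10.0.0.1 GET /index.html - 80 - 220.181.7.1 Baiduspider 200 0 0",
    "2009-08-28 01:00:05 W3SVC1 10.0.0.1 GET /news.asp - 80 - 220.181.7.1 Baiduspider 200 0 64",
]
print(list(suspect_requests(log)))  # [('/news.asp', '200 0 64')]
```

If only dynamic pages show up with status 64 after a configuration change, as in my case, that narrows the suspect list considerably.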
So let me tell everyone: do not casually enable gzip compression for dynamic pages, or Baidu's crawling may fail on them and the site may be demoted in the rankings or even de-indexed ("K-ed"). I learned this through first-hand experiments, and I hope others will take it as a warning.