Yesterday I went through the IIS logs and was pleased to see the Baidu, Google, and Yahoo spiders all crawling the site. It seems the optimization worked: pages that had never been crawled before were finally fetched and indexed by Google after I built some external links to guide the spider in. But the logs also exposed a problem. Google's spider had racked up a large number of requests that returned 404, which is never a good sign; because I had never cleaned up the old code, the site was full of dead links. I then logged into Google's webmaster tools to analyze the site and, heavens, 210 dead links. Google's opinion of my page quality can't be good. Just looking through that many 404 pages was a struggle, never mind fixing them all, and that is when I thought of robots.txt.
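For anyone who wants to repeat this check, here is a minimal sketch of the log scan, assuming the IIS W3C extended log format, where a #Fields directive names the columns and spaces inside a field are encoded as +. The file name ex_today.log and the helper name googlebot_404s are placeholders, not anything from my actual setup:

from collections import Counter

def googlebot_404s(log_path):
    """Count URLs that Googlebot requested and that came back 404."""
    fields = []        # column names from the most recent #Fields line
    hits = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for line in f:
            if line.startswith("#Fields:"):
                fields = line.split()[1:]   # e.g. date time cs-uri-stem ...
                continue
            if line.startswith("#") or not line.strip():
                continue                    # other directives / blank lines
            row = dict(zip(fields, line.split()))
            if (row.get("sc-status") == "404"
                    and "Googlebot" in row.get("cs(User-Agent)", "")):
                hits[row.get("cs-uri-stem", "?")] += 1
    return hits

if __name__ == "__main__":
    for url, count in googlebot_404s("ex_today.log").most_common(20):
        print(count, url)

Running it prints the twenty dead URLs Googlebot hits most often, which makes it easy to see whether they share a pattern like a common file extension.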
Because my 404 URLs basically all end in .asp, such a large batch of 404 pages can be handled with a single rule:
User-agent: Googlebot
Disallow: /*.asp$
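For reference: in this pattern the * matches any sequence of characters and the trailing $ anchors the rule to the end of the URL, so only addresses ending in .asp are blocked. These wildcards are extensions honored by Googlebot and most major crawlers rather than part of the original robots.txt standard.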
When I came in this morning and checked the log of Google's overnight crawl, the spider really had stopped requesting those .asp pages.
If the dead links that show up do not follow a regular pattern, robots.txt is not a good fit. The other approach is to set up a custom 404 page manually. Most hosting providers offer a 404-page option in their control panel; if the program was written in .NET, you can configure the error page in web.config, as sketched below. I logged on to the server directly and changed the page IIS returns for the 404 status code. In a word: use the 404 page to guide visitors on to other useful pages and hold on to them.
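For the web.config route, a minimal sketch might look like the following; the page names are placeholders, assuming a classic ASP.NET application:

<configuration>
  <system.web>
    <!-- mode="On" serves the custom pages to all visitors -->
    <customErrors mode="On" defaultRedirect="~/error.html">
      <!-- send 404s to a page that links visitors to useful content -->
      <error statusCode="404" redirect="~/notfound.aspx" />
    </customErrors>
  </system.web>
</configuration>

One caveat: by default customErrors answers the dead URL with a 302 redirect, and the error page itself returns 200. For SEO it is worth making sure the dead URL still answers with a true 404 status, for example by setting Response.StatusCode = 404 on the error page.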
This article was published by the Koushuiyu Web Tutorial Network (http://www.koushuiyu.cn). Please credit the source when reprinting, thank you!