1. Definition of a dead link: Simply put, a dead link is a link that once worked normally but has since failed. When a dead link is requested, the server returns a 404 error page, e.g. www.wsmghl.com/123.shtml. For us SEOers, however, Baidu is the yardstick: if a page like the one above has never been discovered and indexed by Baidu, Baidu does not count it as a dead link, and it will not affect SEO.
2. How dead links arise: After a site redesign, links that Baidu crawled before the revision may stop working; the search engine has recorded them, but the pages no longer exist, so they become dead links. Some dead links come from carelessly mistyped URLs in a page's links. Baidu Spider will still crawl such bogus addresses repeatedly, so they too will be judged dead links.
3. The harm of dead links: What effect do dead links have on a site's keyword rankings and optimization, and what do they do to user loyalty? Some SEOers shrug: "It's just a link, why fuss over it? I'm run off my feet every day, I have no time for dead links" — and then proudly point to their site's good rankings. Little do they know how much damage dead links do to a site. One piece of rotten meat spoils the whole pot of soup. SEO has always stressed that user experience matters: if users like your site, do you still need to fear that the spiders will stay away? Just as attentive men are always favored by girls while careless men struggle to please them. Of course, raw advantages such as coming from a well-connected or wealthy family are like your site's backlinks and content, but it is the details that win hearts. Simply put, dead links lower a site's weight and degrade the user experience.
4. How to avoid dead links and reduce their harm: Dead links cannot be avoided entirely. When you find one on your site, the fix is to delete or repair it so that the site's structure stays smooth. But even if you delete a dead link, the page Baidu indexed still exists in its records, so Baidu may still judge that the site contains dead links, and the harm to the site remains. To make Baidu give up crawling a dead link, use robots.txt to block Baidu Spider. For example, suppose www.wsmghl.com/123.shtml is a dead link.
Add to robots.txt:
User-agent: Baiduspider
Disallow: /123.shtml
Baidu Spider will then stop crawling that page.
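You can sanity-check a rule like this before deploying it. The sketch below uses Python's standard-library robots.txt parser to confirm that the rule above blocks Baiduspider from the dead page while leaving other pages crawlable; the second URL is a hypothetical example, not from the article.

```python
# Minimal sketch: verify that the robots.txt rule blocks Baiduspider
# from /123.shtml, using Python's standard-library robots.txt parser.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: Baiduspider
Disallow: /123.shtml
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The dead page is blocked for Baiduspider...
print(rp.can_fetch("Baiduspider", "http://www.wsmghl.com/123.shtml"))   # False
# ...while a normal page (hypothetical URL) is still crawlable.
print(rp.can_fetch("Baiduspider", "http://www.wsmghl.com/index.html"))  # True
```

Because the rule names Baiduspider specifically, other crawlers without a matching `User-agent` entry are unaffected.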
We must also keep checking whether new dead links appear. You can use the IIS logs to judge (your hosting provider generates them, typically one per hour; if your host cannot even provide IIS logs, switch hosts immediately). Any request in the IIS log with HTTP status code 404 is a dead link. The 404 check should target search-engine requests in particular: whenever a 404 status code appears, immediately block that URL with robots.txt so the search engine stops crawling it. Blocking a dead link takes effect within about 48 hours; deletion from the search engine's records takes about 4 weeks. Following the method above minimizes the risk dead links pose.
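The log check described above can be scripted. The sketch below scans log lines for 404 status codes, assuming the W3C extended log format that IIS commonly writes; the exact field layout and the sample lines are assumptions for illustration, so adjust the field index to match your own log's `#Fields:` header.

```python
# Minimal sketch: scan IIS log lines (W3C extended format assumed) and
# collect the requested paths that returned HTTP 404, so they can be
# blocked in robots.txt. Sample lines below are made up for illustration.
def find_404s(log_lines):
    """Return the set of requested paths whose status code was 404."""
    dead = set()
    for line in log_lines:
        if line.startswith("#"):   # W3C logs start with #Fields: headers
            continue
        fields = line.split()
        # Assumed field layout: date time cs-method cs-uri-stem sc-status
        if len(fields) >= 5 and fields[4] == "404":
            dead.add(fields[3])
    return dead

sample = [
    "#Fields: date time cs-method cs-uri-stem sc-status",
    "2013-01-01 00:01:02 GET /index.html 200",
    "2013-01-01 00:01:05 GET /123.shtml 404",
]
print(find_404s(sample))  # {'/123.shtml'}
```

Run this against each hourly log and feed any new paths it reports into your robots.txt Disallow list.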
Summary: Dead links certainly have an impact. If they appear, do not panic; calmly follow the method above and your site can be rid of them. I wish you all good rankings, and Happy New Year!
Article from www.wsmghl.com. Please keep the source when reprinting.