A detailed look at methods for detecting and handling dead links on a website

Source: Internet
Author: User
Keywords: dead links, impact, handling, website


In site optimization, dead links are a major factor affecting a site's image. If they are not handled in time, the site's image in search engines will be greatly diminished, and over time this will affect factors such as weight and ranking. We therefore need to deal with these dead links properly. Below, the author looks at methods for detecting dead links and methods for handling them.

How do you check whether a website has dead links?

Perhaps many webmasters do not fully understand the concept of a dead link. Most of the programs webmasters use are downloaded from the Internet, and these programs come with some article content of their own. If we do not remove it thoroughly, the running site will inevitably be left with pages that cannot be opened or that have no content; these are called dead links, or invalid links. So how do we detect which links are dead? There are several good methods.

1. Query with webmaster tools. The dead-link checker at Webmaster Home is worth using; with it we can query our own site for dead links, as shown in the screenshot below:

[Screenshot: dead-link query results in the webmaster tools]

2. Manually check the links on the site's home page. Dead links on the home page generally have the biggest impact on a site's optimization, so we can naturally test every link on the home page and see whether it opens normally; the tool query above also mainly checks the home page's links. This check can be scripted, as sketched below.
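As a rough illustration of that manual check, the sketch below fetches a home page, collects every anchor href, and reports any link that does not answer with HTTP 200. It uses only the Python standard library, and the HOME address is a placeholder to replace with your own site.

```python
# Minimal sketch: fetch the home page, collect the <a href> links on it,
# and report every link that does not answer with HTTP 200.
import urllib.request
import urllib.error
from html.parser import HTMLParser
from urllib.parse import urljoin

HOME = "http://www.example.com/"  # placeholder: your own home page


class LinkCollector(HTMLParser):
    """Gather the href of every <a> tag, resolved against the home page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(HOME, value))


page = urllib.request.urlopen(HOME, timeout=10).read().decode("utf-8", "ignore")
collector = LinkCollector()
collector.feed(page)

for link in sorted(set(collector.links)):
    try:
        status = urllib.request.urlopen(link, timeout=10).getcode()
    except urllib.error.HTTPError as err:
        status = err.code            # e.g. 404 for a dead link
    except Exception:
        status = "unreachable"       # DNS failure, timeout, non-HTTP scheme, etc.
    if status != 200:
        print(status, link)
```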

3. Review the website's IIS logs. This is the most thorough way to check for dead links. In the log, a returned status of 200 means the link is normal; 404 means the file could not be found, i.e. a dead link; and 304 means the page has not changed since the last visit. By analyzing the IIS log we can see which pages return 404; those are the pages the spider could not reach and has treated as dead links. A sketch of such a log scan follows.
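As a rough sketch of that log analysis, the script below scans a W3C-format IIS log, reads the column layout from the "#Fields:" header, and counts how many times each URL returned 404. The log filename is a placeholder.

```python
# Minimal sketch: list every URL in a W3C-format IIS log that returned 404.
LOG_PATH = "u_ex230101.log"  # placeholder: your own IIS log file

fields = []          # column names, taken from the "#Fields:" header line
dead = {}            # URL -> number of 404 hits

with open(LOG_PATH, encoding="utf-8", errors="ignore") as log:
    for line in log:
        if line.startswith("#Fields:"):
            fields = line.split()[1:]
            continue
        if line.startswith("#") or not fields:
            continue                      # other comment lines, or no header yet
        columns = line.split()
        if len(columns) != len(fields):
            continue                      # malformed row
        row = dict(zip(fields, columns))
        if row.get("sc-status") == "404": # 404 = file not found, i.e. a dead link
            url = row.get("cs-uri-stem", "?")
            dead[url] = dead.get(url, 0) + 1

# Print the dead URLs, most-hit first.
for url, hits in sorted(dead.items(), key=lambda item: -item[1]):
    print(hits, url)
```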

How do you handle dead links on a website?

1. Block them with robots.txt. When setting up robots.txt, first find out which links are dead, then add the distinguishing part of those URLs to robots.txt. For example, if a link such as http://www.***.com/boe/bsoe/?id=1.html cannot be opened normally, we can block that page address in robots.txt and tell the spider it must not be crawled (see the sketch below). Of course, robots.txt only blocks on a small scale; it suits personal sites with only a handful of such pages, and it will not work for content-rich sites whose dead links follow no pattern.
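A minimal robots.txt along those lines might look as follows. The paths are placeholders modelled on the article's example URL, and the wildcard rule relies on the pattern support offered by the major crawlers.

```
# Block the dead-link addresses found above so spiders stop crawling them.
User-agent: *
Disallow: /boe/bsoe/?id=1.html

# If many dead URLs share the same parameter, one wildcard rule can cover them:
Disallow: /boe/bsoe/*?id=*
```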

2. Set up 301 redirects. Robots.txt only suits small sites; for a medium-sized site it cannot block everything, because some of the unreachable links follow no pattern. In that case we need 301 redirects: every page that cannot be opened is redirected to a new page. Setting up 301 redirects is a fairly large project and may feel confusing to a novice who has only just started building sites, but it is well suited to medium-sized sites, because a 301 redirect does not lose the page's weight; the search engine passes the original page's weight through the redirect to the new page. A sketch follows.
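As a rough sketch, on an Apache host the redirects could be declared in .htaccess as below; on IIS the URL Rewrite module offers equivalent rules. The source and target paths are placeholders.

```apache
# Minimal sketch (.htaccess, Apache): permanently redirect a dead URL so its
# weight is passed on to the replacement page. Paths are placeholders.
Redirect 301 /boe/bsoe/old-page.html http://www.example.com/new-page.html

# A whole family of dead URLs that share a pattern can be mapped in one rule:
RewriteEngine On
RewriteRule ^boe/bsoe/(.*)$ http://www.example.com/archive/$1 [R=301,L]
```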

3. Use the 404 page to guide a second visit from the dead link. For a large, content-rich website, failing to guide users well when they land on an invalid link will certainly drive the bounce rate too high; handling dead links is one of the optimization details that keeps a big site's bounce rate down. In fact, on large content-rich sites, using the 404 page to guide a second visit from the dead link is the most effective approach, and a 404 page applies across the whole site, so there is no need to worry about irregular links that cannot be blocked. A creative 404 page can also win the user over, such as the expert-style 404 page on Webmaster Home or a small-game-style 404 page; these are all very good 404 pages. The 404 setup on Webmaster Home is the author's favorite. A sketch of the server configuration follows.
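As a minimal sketch, on an Apache host a custom 404 page can be wired up with a single directive; IIS provides the same behaviour through its custom error pages. The page path is a placeholder, and the page itself should carry the guidance the article describes, such as a search box, links to popular content, or a small game.

```apache
# Minimal sketch (.htaccess, Apache): serve /404.html for every missing URL.
ErrorDocument 404 /404.html
```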

For site optimization, disposing of the site's invalid links in time makes the optimization more effective, makes the site friendlier to search engines, and lets spiders crawl more smoothly and more completely. If dead links on the site are not handled promptly, the site's snapshot, ranking and weight will slowly decline; after all, search engines do not like a site full of dead links, because the number of dead links directly affects whether the site's "image" in search is good or bad. Handling dead links well is therefore a lesson every site optimizer must learn. This article was contributed exclusively by www.taob178.com and first published on A5; please credit the source when reprinting. Thank you!
