In site optimization, dead links are a major factor in a site's image with the search engines. If they are not handled promptly, the site's image in search will suffer badly, and over time ranking factors such as weight will be affected too. So we should handle these dead links well; below, follow along as I cover methods for detecting dead links and methods for processing them.
How do you check whether a website has dead links?
Many webmasters may not fully grasp the concept of a dead link. Most of the programs webmasters run are downloaded from the Internet, and those programs ship with some article content of their own; if we do not remove it thoroughly, the site ends up with leftover pages that fail to open or have no content. These are what we call dead links, or invalid links. So how do we detect which links are dead? There are several good methods.
1. Query with webmaster tools. The Webmaster's Home (Chinaz) toolset is one every webmaster should be using; with its dead-link checker we can query whether our own site has any dead links.
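For anyone curious what such a checker is doing under the hood, it simply requests each URL and records the HTTP status code. A minimal sketch in Python using only the standard library; the URLs are placeholders, not addresses from this article:

```python
import urllib.request
import urllib.error

# Placeholder URLs; substitute pages from your own site.
urls = [
    "http://www.example.com/",
    "http://www.example.com/old-page.html",
]

for url in urls:
    try:
        # Use GET rather than HEAD, since some servers reject HEAD requests.
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(url, resp.status)            # 200: the link opens normally
    except urllib.error.HTTPError as e:
        print(url, e.code)                     # 404: a dead link
    except urllib.error.URLError as e:
        print(url, "unreachable:", e.reason)   # DNS or connection failure
```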
2. Manually check the homepage links. Dead links on the homepage generally do the most damage to site optimization, so naturally we can test every link on the front page to see whether it opens normally; the tool query above is likewise a check of the homepage links.
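The manual pass can be automated as well. Below is a hedged sketch, again standard-library Python, that collects every href on the homepage and runs the same status check over it; the homepage URL is a placeholder:

```python
import urllib.request
import urllib.error
from html.parser import HTMLParser
from urllib.parse import urljoin

HOMEPAGE = "http://www.example.com/"  # placeholder: your own homepage

class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag on the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the homepage URL.
                    self.links.append(urljoin(HOMEPAGE, value))

with urllib.request.urlopen(HOMEPAGE, timeout=10) as resp:
    parser = LinkCollector()
    parser.feed(resp.read().decode("utf-8", errors="replace"))

for link in parser.links:
    try:
        with urllib.request.urlopen(link, timeout=10) as resp:
            print(link, resp.status)
    except urllib.error.HTTPError as e:
        print(link, e.code)   # 404s found here are the dead links to fix
    except urllib.error.URLError as e:
        print(link, "unreachable:", e.reason)
```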
3. Review the website's IIS logs. This is the most thorough way to check for dead links. In the log, a return of 200 is a normal link; a return of 404 means the file could not be found, which is a dead link; a return of 304 means the page has not changed at all since the last visit. We then analyze the IIS log to see which pages return 404: naturally those are the pages the spider could not access, and they are treated as dead links.
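Reading a large IIS log by eye is impractical, so the analysis step can be scripted. A minimal sketch assuming the default W3C extended log format, where IIS declares the column layout in a #Fields: header; the log filename is a placeholder:

```python
from collections import Counter

LOG_PATH = "u_ex250101.log"  # placeholder: one of your IIS W3C logs

dead_links = Counter()
fields = []

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if line.startswith("#Fields:"):
            # IIS lists the column layout in this header line.
            fields = line.split()[1:]
            continue
        if line.startswith("#") or not fields:
            continue
        values = line.split()
        if len(values) != len(fields):
            continue
        row = dict(zip(fields, values))
        # sc-status 404 marks a request for a page that no longer exists.
        if row.get("sc-status") == "404":
            dead_links[row.get("cs-uri-stem", "?")] += 1

# Most frequently hit dead links first: these deserve attention soonest.
for uri, hits in dead_links.most_common():
    print(hits, uri)
```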
How do you deal with dead links on a website?
1. Block with robots.txt. When setting up robots.txt, first query which links are dead, then add the distinctive characters those links contain to robots.txt. For example, a link such as http://www.***.com/boe/bsoe/?id=1.html will not open normally, so we can naturally screen out the /?id=1.html page address in robots.txt and tell the spider that this page must not be crawled. Of course, robots.txt only blocks on a small scale; it suits personal sites with just a few such pages, and it will not work for content-rich sites whose dead links follow no pattern.
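A sketch of what such a robots.txt entry might look like, assuming the dead links share the ?id= pattern from the example above (the paths are illustrative):

```
User-agent: *
# Block the one dead page from the example.
Disallow: /boe/bsoe/?id=1.html
# Or, if every ?id= URL under this directory is dead, use a wildcard;
# major spiders such as Baidu and Google honor *, though it is an
# extension beyond the original robots.txt standard.
Disallow: /boe/bsoe/*?id=*
```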
2. Use 301 redirect settings. robots.txt only suits small websites; for a medium-sized site it cannot block everything, because some of the unopenable links follow no pattern. That is when we need 301 redirects: every page that will not open is redirected to a new page. Setting up 301 redirects is a fairly sizeable project and can feel murky to a novice who has only just started running a site, but it is well suited to medium-sized websites, because a 301 redirect does not lose the page's weight; the search engine passes the original weight of the 301-redirected page on to the new page.
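Since the log example earlier came from IIS, here is a hedged sketch of a per-page 301 in an IIS web.config, using the built-in httpRedirect element; both paths are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <!-- Permanently redirect one dead page to its replacement. -->
  <location path="boe/bsoe/old-page.html">
    <system.webServer>
      <httpRedirect enabled="true"
                    destination="/new-page.html"
                    httpResponseStatus="Permanent" />
    </system.webServer>
  </location>
</configuration>
```

On an Apache host the same effect comes from a one-line "Redirect 301 /old-page.html /new-page.html" rule in .htaccess.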
3. Use a 404 page to guide the dead link toward a second visit. On a large content-rich website, failing to guide users well when they hit an invalid link will certainly drive the site's bounce rate too high; dead links are one of the optimization details on a big site, and handling them can bring that bounce rate down. In fact, on large content-rich sites, using a 404 page to guide a second visit is the most effective approach, and a 404 page can be applied across the whole site, so there is no need to worry about irregular links that cannot be blocked. A fresh 404 page can also win the user back: the expert-style 404 on Webmaster's Home and the mini-game-style 404 on Lugo are both very good 404 pages. The Webmaster's Home 404 setup is my personal favorite.
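Wiring the custom 404 page into IIS is a small web.config change. A hedged sketch, assuming you have already created a /404.html guidance page with links back to the homepage and popular columns:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <httpErrors errorMode="Custom" existingResponse="Replace">
      <!-- Serve the guidance page in place of the default error. -->
      <remove statusCode="404" subStatusCode="-1" />
      <error statusCode="404" path="/404.html" responseMode="ExecuteURL" />
    </httpErrors>
  </system.webServer>
</configuration>
```

One detail worth verifying with a header-checking tool: the guidance page must still return a real 404 status rather than 200, otherwise spiders may index the error page itself.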
As far as site optimization goes, disposing of a site's invalid links in good time makes the optimization work more effective, makes the site friendlier to search engines, and lets the spider crawl more smoothly and more completely. If a site's dead links are not dealt with promptly, its snapshot, rankings, and weight will slowly decline; after all, search engines do not like sites taken over by dead links, since the number of dead links directly affects whether the site's "image" in search is good or bad. Handling dead links well is therefore a required lesson in site optimization. This article is an exclusive contribution from www.taob178.com, first published on A5; please credit the source when reprinting. Thank you!