Abstract: A site upgrade is an inevitable part of every site's life cycle, and how you handle the problems that arise during an upgrade tests everyone who wants to succeed. During an upgrade you will inevitably run into dead links, especially on sites with a large number of indexed pages.
Site upgrades are an integral part of every site's life cycle, and how you handle the problems that arise during an upgrade tests everyone who wants to succeed. During an upgrade you will inevitably run into dead links, especially on sites with a large number of indexed pages. These dead links not only seriously hurt the site's weight; if there are many of them, spiders cannot crawl the site smoothly, and indexing and snapshots fall behind. So how do we deal with the dead links a revision causes? Today I will briefly share three points from my own experience.
One: Block with robots.txt or use 301 redirects
This is the method most webmasters reach for first. But it runs into a problem: what if the site's content is very large? Before the revision my own site had 263,000 pages indexed, as shown in the figure below. Can these two methods still cope? Because the content hierarchy of my site changed after the revision, a simple robots.txt block was no use; only 301 redirects would work, and with that much indexed content this is by far the most time-consuming approach.
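When the old paths are known, both options are only a few lines of configuration. A minimal sketch, assuming an Apache server; the directory and file names are hypothetical:

    # robots.txt: block all spiders from a retired directory (path is hypothetical)
    User-agent: *
    Disallow: /old-category/

    # .htaccess (Apache mod_alias): permanently move one old URL to its new home
    Redirect 301 /old-category/page.html /new-category/page.html

The trouble described above is that after a deep restructuring there is no single directory left to block and no simple pattern to rewrite, so a redirect list for hundreds of thousands of indexed URLs has to be built entry by entry.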
Two: Point dead links at a 404 error page
This method is still very useful for sites with a lot of indexed pages: point every dead link on the site at a 404 error page, and then use that page to guide users to the revised pages. This reduces the traffic the site loses and helps users find their way back. As for the 404 page's jump delay, I think it should not be too short, ideally eight to ten seconds, and the page should carry links that entice visitors to click, as shown in the figure below; letting users click through on their own works better than an immediate jump.
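As a sketch of this setup, assuming an Apache server and a hypothetical file named 404.html: one directive tells the server which page to serve, and the page itself waits eight seconds before jumping while offering links the visitor can click sooner.

    # .htaccess: serve a custom page for every 404 (file name is hypothetical)
    ErrorDocument 404 /404.html

    <!doctype html>
    <!-- 404.html: wait 8 seconds, then jump; the links let users click through sooner -->
    <html>
    <head>
      <meta charset="utf-8">
      <meta http-equiv="refresh" content="8;url=/">
      <title>Page not found</title>
    </head>
    <body>
      <p>This page moved in the site revision. You will be taken to the home page
      in 8 seconds, or you can jump straight to a section:</p>
      <a href="/">Home</a>
    </body>
    </html>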
Three: Keep content updates stable
For a freshly upgraded site, stable content updates are critical: they not only attract spiders to crawl quickly during the upgrade, but also let the spiders crawl over the old data and replace it. Search-engine spiders usually crawl a site breadth-first, following hyperlinks. So while updating content steadily, we should also build external links sensibly, attracting spiders step by step, so that new content gets indexed and the old indexed entries are dropped.
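Because a breadth-first crawler reaches a page according to how many links it sits from the home page, one concrete way to apply this is to link every post-revision section directly from the home page, so the new structure is a single hop from the root. A minimal sketch, with hypothetical section names:

    <!-- home page navigation: each new section is one link from the root,
         so a breadth-first spider discovers the revised structure first -->
    <nav>
      <a href="/news/">News</a>
      <a href="/guides/">Guides</a>
      <a href="/reviews/">Reviews</a>
    </nav>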
For a site with a large number of indexed pages, I think the second and third methods are the most effective and direct ways to handle the flood of dead links after a revision. Used together, they reduce both the loss of weight and the loss of traffic.