It takes courage for a webmaster to give up the status quo of a site with more than a million IPs and put it through a revision. He knew the revision would have some impact on the site, but daring to do it suggests he believed there would at least be a way to deal with the fallout. In fact, however, the site weakened step by step. I was also a participant in that revision, and below I will share how the site declined, one step at a time:
Let's look at some of the details of the revision.
1: Meta tag improvement. The original site's meta tags were very poor: the keywords and description were empty. Baidu no longer gives these two tags as much weight as before, but Baidu's statistics show they still play a role, so improving the META tags was the right choice (a quick check of these tags is sketched after this list).
2: URL structure optimization. The original site's URLs were dynamic and carried many parameters. Search engines can still crawl them, but compared with static URLs, dynamic ones are at a disadvantage. This part of the revision was also in line with common sense and the right choice (an illustrative static-URL mapping is sketched after this list).
3: Site structure optimization. I cannot find screenshots of the previous site, so there is nothing to show readers here. But I can say for certain that the change really did improve the user experience, so it was also the right choice.
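As a side note on point 1, a check like the one below can confirm whether the keywords and description tags are actually filled in after a revision. It is only a minimal sketch: it assumes the third-party requests and beautifulsoup4 packages are installed, and the URL is a placeholder rather than the site discussed in this article.

```python
# Minimal sketch: report whether a page's meta keywords/description are filled in.
# Assumes `requests` and `beautifulsoup4` are installed; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

def check_meta(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for name in ("keywords", "description"):
        tag = soup.find("meta", attrs={"name": name})
        content = (tag.get("content") or "").strip() if tag else ""
        status = "OK" if content else "MISSING/EMPTY"
        print(f"{name}: {status} -> {content[:60]!r}")

if __name__ == "__main__":
    check_meta("http://www.example.com/")
```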
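On point 2, the mapping below illustrates what replacing a parameterised dynamic URL with a static-looking path means in practice. The /article.php?cat=...&id=... pattern and the /article/&lt;cat&gt;/&lt;id&gt;.html layout are invented for illustration; they are not the site's actual scheme.

```python
# Minimal sketch: map a parameterised dynamic URL to a static-looking path.
# The URL pattern and target layout are illustrative assumptions only.
from urllib.parse import urlparse, parse_qs

def to_static(dynamic_url):
    parts = urlparse(dynamic_url)
    query = parse_qs(parts.query)
    cat = query.get("cat", ["0"])[0]
    art = query.get("id", ["0"])[0]
    return f"/article/{cat}/{art}.html"

print(to_static("/article.php?cat=3&id=128"))  # -> /article/3/128.html
```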
Some friends may wonder: if the original site was so poor, how did it get such good IP traffic? We all know that many factors determine a site's weight and rankings. Here I can only say that this site had a great deal of original content, together with off-site link building, and that alone is enough to give a site good traffic. So, with the revisions above on top of plenty of original content and external links, how did such a site still weaken? Here are the specific reasons:
First, improper handling of the 404 page
In fact, this was caused by carelessness over a detail. In the 404 page's redirect, "http://" was written as "http:///"; that single extra "/" meant the redirect target did not exist. It was a low-level mistake, but the large number of dead links it produced still cost some traffic. It was corrected quickly, though, so the impact of this error was not large.
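One way to catch that kind of typo before it goes live is a small sanity check on redirect targets, since "http:///" parses with an empty host. This is only a sketch; the list of targets below is made up.

```python
# Minimal sketch: sanity-check redirect targets so a typo like "http:///"
# (empty host) is caught instead of producing dead links. Example targets only.
from urllib.parse import urlparse

def is_valid_target(url):
    parts = urlparse(url)
    # Require a scheme and a non-empty host; "http:///path" parses with an
    # empty netloc and is rejected here.
    return parts.scheme in ("http", "https") and bool(parts.netloc)

for target in ["http://www.example.com/", "http:///www.example.com/"]:
    print(target, "->", "ok" if is_valid_target(target) else "BROKEN")
```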
Second, server problems generating the static pages
The URLs had been made static, but some pages returned "page does not exist" when opened, because the static page had not yet been generated the first time its static URL was accessed. For this reason, the programmers decided to redirect the static URLs back to the dynamic URLs, a decision whose point is hard to see.
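For comparison, a more conventional fallback than redirecting static URLs back to dynamic ones is to generate the missing static page on its first request and serve it from the file after that. The sketch below is only an assumption about how that could look; the directory layout and the render_dynamic_page() helper are hypothetical, not the site's actual code.

```python
# Minimal sketch: if the static file has not been generated yet, build it on
# the first request and save it, instead of returning 404 or redirecting the
# visitor to the dynamic URL. Paths and the renderer are hypothetical.
import os

STATIC_ROOT = "static_cache"  # assumed output directory for generated pages

def render_dynamic_page(path):
    # Placeholder for whatever the CMS does to build the page's HTML.
    return f"<html><body>Rendered content for {path}</body></html>"

def serve_static(path):
    file_path = os.path.join(STATIC_ROOT, path.lstrip("/"))
    if not os.path.exists(file_path):
        # First access: generate the static page now.
        os.makedirs(os.path.dirname(file_path), exist_ok=True)
        with open(file_path, "w", encoding="utf-8") as fh:
            fh.write(render_dynamic_page(path))
    with open(file_path, encoding="utf-8") as fh:
        return fh.read()

print(serve_static("/article/3/128.html")[:40])
```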
Third, duplicate pages
The same page was reachable through both a static URL and a dynamic URL, which produces duplicate pages. We all know search engines treat this very unfavourably, and that is exactly what happened. So, while insisting on the static URLs, I took the temporary measure of blocking the dynamic URLs, and the instability in the site's rankings was indeed brought under some control.
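The article does not say how the dynamic URLs were blocked. One common mechanism is a robots.txt rule that keeps crawlers off query-string URLs, so the snippet below is offered only as an illustrative assumption, not the method actually used.

```python
# Minimal sketch: write a robots.txt that disallows parameterised (dynamic)
# URLs. This is one common way to block them temporarily; it is an assumption,
# since the article does not name the mechanism that was used.
robots_rules = "\n".join([
    "User-agent: *",
    "Disallow: /*?",  # wildcard pattern supported by major crawlers such as Baidu and Google
    "Allow: /",
])

with open("robots.txt", "w", encoding="utf-8") as fh:
    fh.write(robots_rules + "\n")

print(robots_rules)
```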
Fourth, repeated changes
After learning that the static page generation had not succeeded and that the dynamic URLs were in use, the blocking of the dynamic URLs was undone. These repeated back-and-forth operations produced a large number of dead links, and the site was once again pushed into the search engine's untrusted category.
The reminder this case gives us is: the strategic idea behind a revision may be good, but in practice, keep the work within a scope you can control. Many unexpected things will come up, so be ready to adapt on the spot. Original article from http://www.jieyitonggo.com/; please indicate the source when reproducing, thank you!