No website can entirely avoid duplicate content; it is a common problem in site operation. If a site carries a lot of identical or near-identical content, both the user experience and the site's search-engine friendliness suffer badly. In serious cases the search engines may even treat the site as a spam site, and it will never get good indexing or rankings. After analyzing the site's duplicate content with Google Webmaster Tools, I found the causes of the duplication were as follows:
1. Repeated uploads: site editors uploaded the same article several times; the root cause is that no duplicate-title check was ever written into the publishing system (see the sketch after this list).
2. Multiple URLs serving the same content: rewriting dynamic URLs as pseudo-static ones produces two different URLs with identical content, and pages at the same level that call each other's articles produce the same situation.
3. Non-standardized URLs: for example, www.abc.com and www.abc.com/index.html serve substantially the same content.
4. List pagination and content pagination: when list pages and paginated content pages carry the same title, they are also judged to be duplicate content.
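As an illustration of the check missing in cause 1, here is a minimal sketch. The site's actual publishing system is not described, so this assumes a hypothetical SQLite `articles` table; the table and function names are invented:

```python
import sqlite3

def title_exists(conn: sqlite3.Connection, title: str) -> bool:
    """Return True if an article with this exact title is already stored."""
    row = conn.execute(
        "SELECT 1 FROM articles WHERE title = ? LIMIT 1", (title,)
    ).fetchone()
    return row is not None

def save_article(conn: sqlite3.Connection, title: str, body: str) -> bool:
    """Insert the article only if the title is new; block duplicate uploads."""
    if title_exists(conn, title):
        return False  # reject the repeated upload instead of storing it again
    conn.execute(
        "INSERT INTO articles (title, body) VALUES (?, ?)", (title, body)
    )
    conn.commit()
    return True
```

With a check like this at upload time, the first cause of duplication never reaches the live site.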
After identifying the content that made the site repetitive, I at first took a blind measure: directly deleting the duplicate content that the site editors' repeated uploads had caused. The result was a site-wide decline in rankings, a large number of 404 links across the site, and many of the pages the search engine had already indexed returning 500 status codes to the spider. To make up for the damage from this misoperation, I had to promptly clean up the 404 links left by the deleted pages, which took a great deal of effort. So for duplicate content, especially large amounts of it, never choose outright deletion; it is the stupidest approach.
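Before deleting anything, it would have been safer to audit what the spider actually sees. A small sketch using Python's `requests` library; the URL list is a placeholder for whatever pages the search engine has indexed:

```python
import requests

indexed_urls = [
    "http://www.abc.com/article/1.html",  # hypothetical indexed pages
    "http://www.abc.com/article/2.html",
]

for url in indexed_urls:
    try:
        resp = requests.head(url, allow_redirects=False, timeout=10)
    except requests.RequestException as exc:
        print(f"{url}\tERROR\t{exc}")
        continue
    # Pages returning 404 or 500 are the ones that need a 301 or a
    # cleanup -- not another round of deletions.
    if resp.status_code >= 400:
        print(f"{url}\t{resp.status_code}")
```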
On the second attempt I added the rel="canonical" preferred-URL tag to the duplicate pages and also made the duplicates jump directly to the original content. Things got worse: the next day all of the site's Baidu keyword rankings disappeared. Checking the HTTP status codes of the duplicate pages showed they were returning 302. A site with a large number of 302 pages is very likely to be judged spam by the search engines. I immediately undid the jumps, and within a day all of the site's keyword rankings were restored. So for duplicate content, a 301 redirect is the first choice; where a 301 cannot be implemented, use the rel="canonical" preferred-URL tag, and never use a plain direct jump.
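The accidental 302 is easy to reproduce. As one illustration (the site's real stack is not stated, so this assumes a Python/Flask app with an invented route), Flask's redirect() defaults to 302, and the permanent code has to be requested explicitly:

```python
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/index.html")  # hypothetical duplicate of the homepage
def duplicate_home():
    # redirect() without a code argument emits 302 -- the temporary
    # jump described above. Passing code=301 makes it permanent.
    return redirect("http://www.abc.com/", code=301)
```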
The best solution for duplicate content is therefore: never delete duplicates outright and create a flood of 404 errors. Prefer a 301 redirect; where a 301 cannot be implemented, use the rel="canonical" preferred-URL tag; and never use a plain direct jump.
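Where the 301 is impossible, the canonical tag goes in the duplicate page's head. A sketch in the same assumed Flask setup, with invented URLs:

```python
from flask import Flask

app = Flask(__name__)

@app.route("/article-copy.html")  # hypothetical duplicate URL
def duplicate_article():
    # The duplicate is served normally (200) but declares the preferred
    # URL, so engines consolidate ranking signals onto the original.
    return (
        "<html><head>"
        '<link rel="canonical" href="http://www.abc.com/article.html" />'
        "</head><body>Duplicate page body...</body></html>"
    )
```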
In short, duplicate content is a problem no website can evade; we can only try to prevent it and then resolve it. When dealing with it, consider everything carefully and avoid misoperations that bring the site irreparable damage.
This article was originally shared, based on hands-on experience, by the QQ expressions site http://www.jiongyaya.com/ ; please credit the source when reprinting!