How to better handle duplicate content: two case studies in optimization mistakes

Source: Internet
Author: User


Over long-term operation, no site can entirely avoid publishing identical content. A site that accumulates a large amount of duplicate or near-duplicate content is, understandably, unfriendly to both visitors and search engines. If the duplication problem becomes serious enough, the search engine may classify the site as spam, which can ultimately lead to the site being de-indexed (a "K-station" in Chinese SEO slang). In my experience, duplicate content generally arises for the following reasons.

1. Repeatedly publishing the same content. When publishing, we may intentionally or unintentionally submit the same article more than once. The root cause is the lack of a duplicate-title check. By contrast, A5 enforces strict restrictions on duplicate content: a submission whose title already exists is rejected outright.
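A duplicate-title check like the one A5 is described as using can be sketched as follows. This is a minimal illustration only; the in-memory set stands in for whatever database of published titles a real CMS would consult, and the normalization rules are my own assumptions.

```python
# Sketch: refuse a submission whose title already exists.
# `existing_titles` is a placeholder for a real persistent store.
existing_titles = set()

def normalize_title(title):
    """Case-fold and collapse whitespace so near-identical titles match."""
    return " ".join(title.lower().split())

def try_publish(title):
    """Return True if the article may be published, False if it is a duplicate."""
    key = normalize_title(title)
    if key in existing_titles:
        return False          # duplicate title: reject the submission
    existing_titles.add(key)
    return True
```

Running the check twice on the same title (even with different casing or spacing) would reject the second submission.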

2. Poor handling of dynamic, static, and pseudo-static URLs. A dynamic URL and a pseudo-static URL can serve the same content under two different addresses, as can a dynamic URL and a static URL. Non-standardized URL formats easily create duplicate-content problems, and this cause has the biggest impact.

3. Non-standard URL forms. For example, www.hnyinshua.com/ and www.hnyinshua.com/index.html serve the same content under two different addresses.
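Causes 2 and 3 both come down to several URL variants resolving to one page. A minimal sketch of URL canonicalization, using only the Python standard library, might look like this; the specific rules (stripping index.html, dropping query strings) are illustrative assumptions, not a standard, and a real site would tune them to its own URL scheme.

```python
# Sketch: collapse URL variants that serve identical content into one
# canonical form, so links and sitemaps reference a single address.
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    parts = urlsplit(url)
    path = parts.path
    # /index.html and / serve the same page, so strip the filename.
    if path.endswith("/index.html"):
        path = path[: -len("index.html")]
    # Treat a missing path as the site root.
    if path == "":
        path = "/"
    # Drop query strings and fragments that create dynamic duplicates.
    return urlunsplit((parts.scheme, parts.netloc, path, "", ""))
```

With these rules, www.hnyinshua.com/index.html and www.hnyinshua.com/ normalize to the same address.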

I once faced a large amount of duplicate content on my own site, and since it was my first encounter with the problem, I naively assumed it could be solved by simply deleting the duplicate pages. So I manually removed every duplicate page on the site. Far from solving the problem, the site's keyword rankings dropped immediately and a large number of dead links appeared; checking the IIS logs, I found search engine crawlers receiving many 500 status codes. These issues could only be mitigated afterwards with 404 error pages, and the price paid was enormous. The lesson: duplicate content, especially in large quantities, should never be blindly deleted, or you will pay dearly for it.

The second time I encountered a large amount of duplicate content, I did not delete it recklessly. Instead, I added the rel="canonical" preferred-URL tag to the duplicate pages and then redirected them to the genuine page. But a problem remained, and it was even worse: the next day, when I checked keyword rankings on Baidu, all of them had disappeared. The IIS logs showed a large number of pages returning a 302 status code, and a site with many 302 redirects is likely to be judged by search engines as spam. I immediately removed the redirects, and after two days the keyword rankings finally recovered. The lesson from this episode: when redirecting duplicate content, prefer a 301 redirect, and fall back on the rel="canonical" preferred-URL tag only when a 301 cannot be implemented.

After these two failures with duplicate content, my conclusion is: never delete duplicate content outright, or you will create a large number of dead links. For duplicate content, prefer a 301 redirect; only when that is impossible should you rely on the rel="canonical" tag.
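The difference between the harmful 302 and the recommended 301 can be sketched with a tiny WSGI handler from the Python standard library interface. The duplicate-to-canonical mapping and the paths below are illustrative assumptions; a production site would typically do this in the web server configuration instead.

```python
# Sketch: answer requests for a duplicate URL with a permanent 301 redirect
# to the canonical address, rather than a temporary 302.
DUPLICATE_TO_CANONICAL = {
    "/index.html": "/",   # assumed duplicate -> canonical mapping
}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    target = DUPLICATE_TO_CANONICAL.get(path)
    if target is not None:
        # 301 tells search engines the move is permanent, so ranking
        # signals are consolidated onto the canonical page.
        start_response("301 Moved Permanently", [("Location", target)])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html; charset=utf-8")])
    return [b"<html><body>canonical page</body></html>"]
```

The same handler with "302 Found" would reproduce the temporary-redirect behavior that, in my experience above, cost the site its rankings.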

In summary, mistakes in site optimization are as unavoidable as duplicate content itself; what matters most is learning from each mistake. I hope this article is helpful to everyone.

This article was submitted by Changsha advertising company http://www.hnyinshua.com/. Please retain the source when reproducing it.
