As a site optimizer, you may receive warnings about duplicate content during your daily work. In this article I'll explain how to handle duplicate content so that it doesn't hurt your search engine rankings. First, let's be clear about what duplicate content actually means.
Duplicate content simply means that two different URLs on your site serve largely the same text. This can happen, for example, when you offer a printer-friendly version of a page: you add a copy stripped of ads and navigation so that it's easier to print, but search engines then treat the original page and the print page as duplicates. Even the mobile version of your site may be counted as duplicate content. In short, duplicate content arises very easily. Of course, some sites create duplicate content intentionally, hoping that many URLs carrying the same content will bring in more traffic.
Since duplicate content is so common, you may wonder what Google does about it. The answer is not simple: it depends on whether Google's algorithm believes the duplication is intended to deceive search engines.
If Google's algorithm concludes that you are duplicating content to game the system for extra traffic, your site will be penalized; it may even be removed from Google's index entirely.
If Google decides the duplication is not "malicious", it will simply pick one of the pages to show in search results. That means your print page may be the one crawled and displayed, while the original page carrying the ads is not shown at all. Visitors may then never see the page you actually want them to see.
There are several ways to make sure the pages you want displayed are the ones search engines index. The easiest is to add a NOINDEX tag to the duplicate pages so search engines will not index them. You can also use a robots.txt file to block search engine spiders from crawling certain directories, which achieves the same effect of hiding the duplicate pages.
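As a quick sketch of both approaches: the printer-friendly copy of a page could carry a robots meta tag like the one below, and robots.txt can block a whole directory of duplicates. The /print/ directory here is only an illustrative example; substitute the actual path where your duplicate pages live.

```html
<!-- In the <head> of the printer-friendly page: tell crawlers
     not to index this copy (links on it may still be followed) -->
<meta name="robots" content="noindex, follow">
```

```
# robots.txt at the site root: keep spiders out of the
# hypothetical /print/ directory holding the duplicate pages
User-agent: *
Disallow: /print/
```

Note the difference: NOINDEX lets the page be crawled but keeps it out of the index, while robots.txt stops crawling of the blocked paths altogether.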
Building a successful site means paying attention to many issues, and duplicate content is one we cannot afford to ignore. Handle it properly, and your website will be better placed to grow. This article is a handwritten original by the author of the seedling pig price site http://www.zhongzhu666.com/; please credit the source when reprinting. If you have good methods to share, the author's QQ is 1051118541.