Three Effective Methods to Resolve Repeated Page Indexing

Source: Internet
Author: User
Tags: repetition

When a page is indexed by the search engine several times over, many webmasters do not know whether that is good or bad. In the author's view, it is not a good sign: it fills the index with duplicate copies of the site's own pages, which is not conducive to site optimization. In the author's experience, repeated indexing of the same page has a definite impact on the site's weight and rankings, so solving this kind of problem and bringing the site's indexing back to normal is necessary.

A page being indexed several times does nothing for site optimization or for the site's weight and rankings. It can only inflate the indexed-page count, and not for long, while also hurting the perceived quality of the site. If the same article is indexed three times, the weight it earns is split across the copies. For this kind of problem, webmasters should not assume that the more duplicates the better just because the indexed count rises. Is that really useful? Things are valuable because they are scarce: a pearl is expensive because it is relatively rare; if pearls were mass-produced, would they still be expensive? There are several ways to resolve this situation:

  I. Use robots.txt to block the duplicate URLs

Most sites expose each page under two URLs, especially sites that generate static pages: the dynamic URL still exists, and generating the static copy adds another. When an article is indexed, both URLs get indexed, which is why you can clearly publish three articles yet find six or even more indexed entries: the search engine treats the two URLs of the same article as independent pages and indexes both. Webmasters usually want as many of their pages indexed as possible, but when the same article is counted several times, the increase does not last; the duplicates are soon deleted, because identical content only takes up space in the search engine's database. That is why so many pages may be indexed today and gone a few days later. For this cause of duplication, simply use robots.txt to block the dynamic URLs.
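A minimal robots.txt sketch along these lines (the patterns below are assumptions; adapt them to the dynamic URL structure your CMS actually produces, and note that wildcard rules are honored by major crawlers but not by every spider):

```
# Hypothetical rules: block dynamic URLs so only the static
# copies are crawled. Adjust the patterns to your own CMS.
User-agent: *
Disallow: /*?          # any URL carrying a query string
Disallow: /index.php   # the dynamic entry script itself
```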

  II. Clear the cache promptly

Cache problems are mainly due to improper configuration of the hosting space or poor hosting performance; a well-performing host either keeps no such cache or serves pages fast enough that caching never becomes an issue. Many hosts do have a caching mechanism, and on a site whose pages are indexed within seconds, a cache that is not cleared in time lets the same article be indexed several times. When you first finish adding the article, a cached copy of the edit is naturally left on the server; when you then publish and generate the static or pseudo-static page, another cached copy appears and another URL is generated. That can leave three URLs for one article. Some programs also auto-save, and an auto-save is a backup, so of course it has a cache of its own; otherwise how would it restore anything? If these caches are not cleared in time, the spider crawls them along with the real page, and repeated indexing naturally follows. Clearing the cache promptly therefore helps resolve repeated indexing.
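As a rough illustration only, a publish hook could wipe the cache directory right after the static pages are generated; the CACHE_DIR path below is a hypothetical stand-in for wherever your CMS or host actually keeps its cache:

```python
import shutil
from pathlib import Path

# Hypothetical cache location; substitute your CMS's real cache path.
CACHE_DIR = Path("/var/www/site/cache")

def clear_cache() -> None:
    """Remove cached copies left over from editing and auto-save,
    so the spider finds only the final static URL of each page."""
    if not CACHE_DIR.exists():
        return
    for entry in CACHE_DIR.iterdir():
        if entry.is_dir():
            shutil.rmtree(entry)   # cached subdirectories
        else:
            entry.unlink()         # cached files

if __name__ == "__main__":
    clear_cache()  # run right after regenerating static pages
```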

  III. Use a sitemap to guide the spider

A spider crawls the robots.txt file first, then the sitemap, and only then the homepage. Setting up a sitemap therefore not only reduces repeated indexing but also lets the spider crawl more smoothly, and everyone knows that a site the spider can crawl smoothly naturally leaves a good impression on the search engine; it reads as stable and unobstructed. Many webmasters think nothing of the sitemap, but it is like the floor plan of a building: it lets the spider see the full contents of the site, making the crawl far more convenient. Keeping a complete sitemap, and regenerating it with every update, helps the site in its own right, and the map effectively reduces repeated indexing. So the sitemap's role is not only to summarize the content of the whole site; it also keeps the spider's crawl smooth and unobstructed, and at reducing repeated indexing it excels.
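As a minimal sketch of "regenerating the map with every update," the script below writes a sitemap.xml from a list of canonical URLs; the example.com domain and the URL list are placeholders, and in practice you would collect the static URL of each page from your CMS:

```python
from datetime import date

# Placeholder URLs; list only the static (canonical) version of each
# page, never the dynamic duplicate blocked in robots.txt above.
URLS = [
    "http://www.example.com/",
    "http://www.example.com/article-1.html",
    "http://www.example.com/article-2.html",
]

def write_sitemap(path: str = "sitemap.xml") -> None:
    """Regenerate the sitemap so the spider is guided to exactly
    one URL per page."""
    today = date.today().isoformat()
    entries = "\n".join(
        f"  <url>\n    <loc>{u}</loc>\n    <lastmod>{today}</lastmod>\n  </url>"
        for u in URLS
    )
    xml = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )
    with open(path, "w", encoding="utf-8") as f:
        f.write(xml)

if __name__ == "__main__":
    write_sitemap()  # call this after every content update
```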

Letting one page be indexed many times in order to raise the site's indexed count is, in fact, not a good thing. It pushes the duplication of identical content too high, the spider cannot tell which copy is the original, and the weight is naturally dispersed. So do not be greedy for this small gain; the negative consequences are considerable. This article was contributed by http://jf50.com/ (which diet pill works well); if you reproduce it, please keep the link. Thank you!


