Is a site's quality measured by how many pages are indexed, the more the better? Webmasters used to treat the number of indexed pages as the standard for judging a site's quality. In practice, though, the indexed count, especially in Baidu, fluctuates constantly, while a large amount of low-quality content clearly drags down the site's weight. So the number of indexed pages alone cannot determine a site's weight, and deliberately reducing the number of low-quality pages that get indexed is actually good for the site's development.
One, reduce the duplicate pages indexed within the site
If you have read Baidu's optimization guide, you know that search engines treat different URLs as different pages: the URL is the primary criterion for telling pages apart. When the same content is reachable from several URLs, the search engine has to pick one version, and having duplicate pages of identical content indexed is very unfriendly to the site. So tell the spider which versions not to crawl, use redirects (jumps) to consolidate variants, or designate one URL as the page's canonical address.
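One common source of duplicate URLs is session and tracking parameters. As a minimal sketch (the domain, parameter names, and function are illustrative assumptions, not from the original article), variant URLs can be collapsed to one canonical form before they are published or submitted to a search engine:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Parameters assumed to create duplicate URLs (illustrative list)
IGNORED_PARAMS = {"utm_source", "utm_medium", "sessionid"}

def canonicalize(url: str) -> str:
    """Strip tracking/session parameters so URL variants map to one page."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    # Drop the fragment as well: it never changes the page content
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(query), ""))

print(canonicalize("http://www.example.com/article.html?sessionid=abc&id=5"))
# → http://www.example.com/article.html?id=5
```

The same effect can also be achieved server-side with a 301 redirect, which tells the spider explicitly which URL is authoritative.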
Two, block pages that are unfriendly to spiders
Since a site's low-quality pages are unfriendly to spiders, we should find a way to block them. The difficulty is that users and search engines judge pages by different standards: a page that helps communication between users may still look worthless to a search engine. Blocking search engines from such pages while keeping them available to users protects the site's weight without hurting the user experience, the best of both worlds.
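For a page that should stay visible to users but out of the index, a per-page robots directive is one common approach (a minimal sketch; the tag below is standard markup, not something specified in the original article):

```html
<!-- Placed in the page's <head>: keep this page out of the index,
     but still let the spider follow its links -->
<meta name="robots" content="noindex, follow">
```

Unlike a robots.txt rule, this blocks only indexing, so users can still reach the page normally.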
Three, block the site's dead links
As a website develops, dead links are unavoidable: we delete an article, we change an article's URL, and so on. If the search engine has already crawled an article at its old address, then once the page is deleted or moved, that old address becomes a dead link. So whenever we delete an article or change its URL, we must remember to block the old address immediately.
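What makes a link "dead" to a crawler is the HTTP status code it receives on revisit. A minimal sketch (the helper function and status set are illustrative assumptions): 404 and 410 mark the page as gone, while a 301 redirect preserves it under its new address.

```python
# Status codes that tell a revisiting crawler the page is gone
DEAD_STATUSES = {404, 410}

def is_dead_link(status_code: int) -> bool:
    """Return True if the status code indicates a dead link."""
    return status_code in DEAD_STATUSES

# A deleted article should answer 404 (or 410), not 200 with an error page,
# so the search engine can drop it from the index promptly.
print(is_dead_link(404))  # True: page deleted
print(is_dead_link(301))  # False: page moved, the redirect preserves it
```

This is why a moved article should get a 301 redirect from the old URL to the new one, while a genuinely deleted article should return 404 rather than a "soft 404" page with status 200.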
Four, block the website's back end
Our website's back end should be accessible only to ourselves and should not be seen by users, so it should be blocked from spiders as well. The usual way is a robots.txt file.
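A robots.txt rule for this is short, and the Python standard library can even be used to check that the rule does what we intend. In this sketch the back-end path `/admin/` and the domain are assumptions for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block every spider from the back-end path
rules = """\
User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Back-end pages are blocked; normal articles remain crawlable
print(parser.can_fetch("*", "http://www.example.com/admin/login.php"))   # False
print(parser.can_fetch("*", "http://www.example.com/article/123.html"))  # True
```

Note that robots.txt only asks well-behaved spiders to stay out; the back end still needs its own login protection.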
Summary:
In fact, it does not matter which blocking method you use, as long as the spider cannot crawl your site's low-quality pages. Search engines now demand ever higher page quality, so if we want our sites to develop over the long term, this work is essential. This article was provided by http://www.gexings.com; please indicate the source when reprinting, thank you.