A common phenomenon in search engines is that the same content gets indexed more than once: there is a static page, and there is also a dynamic or pseudo-static version of it. This inflates the site's indexed page count, but search engines do not approve of such "duplicate content." At best, the duplicated snapshots are eventually recycled and deleted; at worst, the site's weight is directly affected. My point today is that being "repeatedly indexed" by search engines is a misfortune in site optimization. If the phenomenon exists, webmasters should take it seriously and deal with it as thoroughly as possible, so that it does not hurt the site's ranking in the search engines.
One: What causes repeated indexing?
In the site optimization process, search engines do not like duplicate content on a site. On this point they have published clear standards; in Google's Webmaster Tools you can even see the relevant recommendations directly. So why does content still get indexed repeatedly?
A: A search engine is, after all, only a man-made set of rules; it cannot recognize every case perfectly, so the same content reachable at multiple URL addresses ends up indexed repeatedly.
B: Program problems. Some programs keep dynamic pages in the background for the convenience of a "preview" function. If this is designed carelessly, the search engine can still discover those pages through the directory index and include them (a directory-listing sketch follows after point C).
C: Template problems. This is easy to overlook when building templates: the finished template produces both dynamic pages and static pages, so how could the search engine not index the content twice? (See the canonical-tag sketch below.)
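For point B, a minimal sketch of closing the directory-index loophole, assuming an Apache server and a hypothetical /preview/ directory (the directive goes in the site's .htaccess or virtual-host configuration):

  # .htaccess: disable automatic directory listings so spiders
  # cannot browse folders such as a hypothetical /preview/ directory
  Options -Indexes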
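For point C, when the dynamic and static versions of a page must coexist, one common remedy (not from the original article) is a rel="canonical" tag in the page head, which tells the search engine which URL is the authoritative copy. A minimal sketch, with a hypothetical URL:

  <!-- placed in the <head> of both the static and the dynamic version -->
  <link rel="canonical" href="http://www.example.com/article-123.html" />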
Two: What are the consequences of repeated indexing?
Site optimization should prevent duplicate pages from being indexed at all, but if they are indexed, what is the impact? Generally the search engine will later delete the dynamic copies. But if pages are repeatedly indexed and then culled, indexed and then culled again, the spider eventually develops an aversion to the site, which directly lowers its crawl frequency. There is a further risk: if a large number of such duplicate pages are indexed and not yet removed, they become a hidden danger to the site's growth. When the search engine later carries out a major cleanup of duplicate snapshots, the site's current weight can be seriously hurt, and if that cycle keeps repeating, how can the site ever develop? Search engines explicitly advise against websites where multiple URLs lead to the same content. So even a larger indexed page count is not necessarily a good thing!
Three: How to avoid repeated indexing
Once the root cause is found, solving the problem is actually quite easy. There are two fixes: A: URL standardization; B: use robots.txt to block dynamic files.
A: When building the site, unify the URL addresses as far as possible and do not link to dynamic pages. For one thing, search engines favor static pages; for another, this prevents the same content from being indexed repeatedly (see the redirect sketch after point B).
B: Use robots.txt to block dynamic files, for example: "Disallow: /*?*". Note: this kind of blocking is only suitable when the site has no "tag" pages, otherwise the tag pages will be blocked as well. Blocking rules should be applied flexibly; different programs call for different rules, as in the robots.txt sketch below.
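For point A, a minimal sketch of URL unification, assuming an Apache server with mod_rewrite enabled and a hypothetical dynamic URL pattern (article.php?id=123) that should permanently redirect to its static equivalent:

  # .htaccess: 301-redirect a hypothetical dynamic URL to its static version,
  # e.g. /article.php?id=123  ->  /article-123.html
  RewriteEngine On
  RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
  RewriteRule ^article\.php$ /article-%1.html? [R=301,L]

The 301 (permanent) redirect also tells the search engine which of the two addresses is the one to keep in its index.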
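And for point B, a minimal robots.txt sketch; the commented-out directory rule is a hypothetical example of a program-specific alternative:

  # robots.txt: keep spiders away from dynamic duplicates
  User-agent: *
  # Block any URL containing a query string (unsuitable if tag pages use "?")
  Disallow: /*?*
  # Hypothetical program-specific alternative: block only a preview directory
  # Disallow: /preview/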
Binary Network, together with the professional website construction company Pilotage Technology (www.joyweb.net.cn), believes that website construction and optimization demand strict attention to detail: duplicate content is a misfortune and should be taken seriously, or it will be hard to overcome the final ranking obstacles. Having written this article today, I sincerely hope it helps more webmasters get rid of their optimization problems.