A website that has been indexing normally and suddenly stops being indexed signals, for any SEOer, the start of a painful period. The author's site ran into exactly this situation a while ago; fortunately, after inspection and treatment, it returned to normal. Based on that real case, this article covers the reasons a site may suddenly stop being indexed, and the solutions.
The general situation: starting around 12.15, the information pages that the author's site had been updating and getting indexed daily stopped being indexed; then indexing across the rest of the site began to shrink; by 9.23 the site had stopped being indexed altogether, the Baidu snapshot stagnated, and the ranking for the site keyword "nanjing seo" declined.
Because there are many possible reasons a site is not indexed, the checking took a lot of time. It was not blind checking, though; you need a clear direction in mind. A site not being indexed comes down to no more than three situations: 1. the spider does not come; 2. the spider comes but cannot find the pages and leaves; 3. the spider comes, even enters some pages on the site, but still takes nothing away. Starting from these three causes, I made the following checks:
1. Check the IIS logs. From the IIS logs you can clearly see the spider's whereabouts: whether it has come to the website, when it came, and how frequently. If the spider does not come, the site naturally will not be indexed.
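As a rough illustration of this check (not the author's actual script), here is a minimal Python sketch that tallies Baiduspider visits per day from IIS W3C logs; the log directory is a placeholder, and the default field order with the date as the first column is assumed:

```python
# Minimal sketch: tally Baiduspider visits per day from IIS W3C logs.
# Assumptions: logs live under the path below (hypothetical) and use the
# default W3C fields, where the date is the first column and the user
# agent contains "Baiduspider".
from collections import Counter
from pathlib import Path

LOG_DIR = Path(r"C:\inetpub\logs\LogFiles\W3SVC1")  # adjust to your server

visits = Counter()
for log_file in LOG_DIR.glob("*.log"):
    for line in log_file.read_text(errors="ignore").splitlines():
        if line.startswith("#"):          # skip W3C header lines
            continue
        if "Baiduspider" in line:
            date = line.split(" ", 1)[0]  # first field is the date
            visits[date] += 1

for date, count in sorted(visits.items()):
    print(date, count)
```

If the counts drop to zero on some date, the spider has stopped coming, and the problem lies upstream of your pages.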
2. Check the entrances. If the spider does come to your website normally, the first thing to look at is your robots.txt file: check whether a careless edit disallowed pages that should normally be indexed, or disallowed the only entrance (or the main entrance) to pages that should be indexed. Note also that robots.txt should not be changed frequently: every change makes the spider reconsider which pages to crawl and which not to crawl, and frequent changes annoy spiders. In addition, check whether the various entrances to your site's pages are normal.
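One quick way to verify the live rules is sketched below with Python's standard urllib.robotparser; the site URL and page paths are placeholders, and "Baiduspider" is Baidu's user-agent token:

```python
# Sketch: verify that key pages are still crawlable under the live robots.txt.
# The domain and paths are placeholders; replace them with your own.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")  # your site here
rp.read()

pages = [
    "/",                       # home page
    "/news/",                  # information channel entrance
    "/news/article-123.html",  # a sample article page
]
for page in pages:
    ok = rp.can_fetch("Baiduspider", page)
    print("allowed" if ok else "BLOCKED", page)
```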
3. Check the pages. If the spider comes, your robots.txt shows no major change from before, and the structure and page entrances have not changed much, then the problem must lie in the pages themselves. For article pages, consider their quality: are they too heavily collected from elsewhere? Are they original? Also check whether your own articles are being collected by others too much (a check many people neglect). If an article is collected too much, and the sites collecting it carry more weight than yours, Baidu may conclude that your station is the collecting station, especially when your articles are frequently collected by different stations. As for other pages, look at whether newly added pages are too similar in content, whether titles repeat, and so on; spiders dislike all of these. A quick duplicate-title check is sketched below.
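For the title-repetition part of that check, here is a small sketch using Python's standard difflib; the titles and the 0.8 threshold are illustrative, not the author's method:

```python
# Sketch: flag near-duplicate titles among new pages before publishing.
# The titles list is illustrative; in practice you would pull them from
# your CMS or sitemap.
from difflib import SequenceMatcher
from itertools import combinations

titles = [
    "Nanjing SEO: how to recover a site that stopped being indexed",
    "Nanjing SEO: how to recover a website that stopped being indexed",
    "Checking IIS logs for spider visits",
]

for a, b in combinations(titles, 2):
    ratio = SequenceMatcher(None, a, b).ratio()
    if ratio > 0.8:  # threshold is a judgment call
        print(f"too similar ({ratio:.2f}): {a!r} vs {b!r}")
```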
Check your site against these three causes, and I believe you will find the answer to why it suddenly stopped being indexed. In fact, after inspection, the problem with the author's station turned out to have several causes: the information pages were being collected by several people, and rather frequently; and during an earlier site revision, there were several pages I judged unnecessary and disallowed, without taking into account that they served as entrances to other pages, which caused those pages not to be indexed.
Here are the solutions:
1. If you check the IIS logs and find the spider did not come, your site has very likely been demoted. Check your friend links; check your server status, whether it is returning too many 404 or 503 responses and whether many pages cannot be accessed; and do not fake traffic, which is a leading cause of demotion. The status codes can be read straight out of the same IIS logs, as sketched below.
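A minimal sketch of that status check, again assuming IIS W3C logs (the file path is a placeholder, and the sc-status column name is taken from the #Fields header):

```python
# Sketch: tally HTTP status codes returned to Baiduspider in an IIS W3C log.
# Assumes the default W3C format, where the "#Fields:" header names the
# columns; adjust LOG_FILE to your server.
from collections import Counter

LOG_FILE = r"C:\inetpub\logs\LogFiles\W3SVC1\u_ex231215.log"  # placeholder

fields, statuses = [], Counter()
with open(LOG_FILE, errors="ignore") as f:
    for line in f:
        if line.startswith("#Fields:"):
            fields = line.split()[1:]          # column names
            continue
        if line.startswith("#") or "Baiduspider" not in line:
            continue
        parts = line.split()
        if fields and len(parts) == len(fields):
            statuses[parts[fields.index("sc-status")]] += 1

print(statuses.most_common())  # many 404/503 entries is a warning sign
```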
2. If the problem is in robots.txt, it is easy to fix: just correct it. Remember to consider the links between pages; do not let disallowing page A seriously affect page B.
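A hypothetical example of this kind of mistake and its fix (the paths are made up; Baiduspider honors Allow rules):

```
# Before: intending to block an old, unneeded section...
User-agent: *
Disallow: /old-section/

# ...but /old-section/list.html was the only entrance linking to the
# article pages, so those pages stopped being crawled.

# After: keep the entrance page crawlable while blocking the rest.
User-agent: *
Disallow: /old-section/
Allow: /old-section/list.html
```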
3. If the problem is in the pages, then what you must do is increase the originality of your articles. Collecting too much gets your site judged a garbage station by Baidu, and being collected too much by others can likewise get your site judged the garbage station. Do these checks well, and be especially careful about machine collection: collection tools like Locomotive (huochetou) save many webmasters a lot of work, but if your station is harvested by this type of tool, it is a very depressing thing. You can put some restrictions in the pages, for example making the p, div, and span wrapper code interchangeable, and so on, so that a template-based collector cannot lock onto one fixed structure.
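A toy sketch of that tag-swapping idea (purely illustrative server-side code, not the author's implementation):

```python
# Sketch: render each paragraph with a randomly chosen wrapper tag
# (p / div / span) so a template-based collector cannot rely on one
# fixed HTML structure. Styling is carried by the class, not the tag.
import random

WRAPPERS = ["p", "div", "span"]

def render_paragraphs(paragraphs):
    html = []
    for text in paragraphs:
        tag = random.choice(WRAPPERS)
        html.append(f"<{tag} class='txt'>{text}</{tag}>")
    return "\n".join(html)

print(render_paragraphs(["First paragraph.", "Second paragraph."]))
```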
Through the methods above, the author's website has returned to normal; as of the writing of this article, the articles just updated have all been indexed.
A site suddenly not being indexed gives people a headache, but clarifying your thinking matters most in doing SEO. Do not run into a dead end; work your way out by reasonable means, and once you are out, you will find your site better than before and Baidu more favorable toward it. This article is from http://www.seo1912.com; please credit the source when reprinting.