I often see friends asking: my site has been online for more than ten days or nearly a month and still has not been indexed, what is the cause? My answer is to look at how many spiders come every day; other people's answer is simply "I don't know." The result is that the IIS log gets easily overlooked, but what is easily overlooked is often the most important thing. In "Shen Edge SEO: Website Barometer, the IIS Log (1)" we covered how to get the IIS log; here let's look at what the IIS log can tell us about a site.
First: through the IIS log we can see which spiders come to our site every day, how many times they come, and whether their crawls succeed. As long as we see spiders caring for our site daily, we can judge whether the spiders like our site; being indexed and released is then only a matter of time. A small parsing sketch follows.
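As a rough illustration (not from the original article), here is a minimal Python sketch of that daily check. It assumes a W3C extended IIS log with a #Fields header line; the file name ex120501.log and the watch list of spider names are hypothetical examples.

    from collections import Counter

    SPIDERS = ("Baiduspider", "Googlebot", "bingbot", "Sogou")  # hypothetical watch list

    def parse_w3c_log(path):
        """Yield one dict per request line from a W3C extended (IIS) log."""
        fields = []
        with open(path, encoding="utf-8", errors="replace") as f:
            for line in f:
                line = line.strip()
                if line.startswith("#Fields:"):
                    fields = line.split()[1:]  # column names for the lines below
                elif line and fields and not line.startswith("#"):
                    yield dict(zip(fields, line.split()))

    # count how many times each spider came, per day
    visits = Counter()
    for row in parse_w3c_log("ex120501.log"):
        agent = row.get("cs(User-Agent)", "").lower()
        for bot in SPIDERS:
            if bot.lower() in agent:
                visits[(row.get("date", "?"), bot)] += 1

    for (day, bot), n in sorted(visits.items()):
        print(day, bot, n)

If the counts hold steady or rise day over day, that is the "daily care" the article describes.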
Second: you can see when spiders come to our site, which pages they crawl, whether the crawl succeeded and the page was indexed, and whether they ran into dead links. For example, if the return code of a crawl is 404, that link is a dead link; a return code of 200 is a successful crawl. A code like 200 2869 1156 indicates a successful crawl that was indexed, while 200 0 0 indicates a successful crawl that was not necessarily indexed; if the return code is 304, the page has no new content; and so on. A tallying sketch follows.
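To make the code reading concrete, here is a sketch of my own, reusing the parse_w3c_log helper from the first sketch and the same hypothetical file name, that tallies the sc-status values and collects the 404 URLs. Whether a 200 line was actually indexed still has to be confirmed in the search engine itself.

    from collections import Counter

    status_count = Counter()
    dead_links = set()

    # reuses parse_w3c_log from the first sketch
    for row in parse_w3c_log("ex120501.log"):
        status = row.get("sc-status", "")
        status_count[status] += 1
        if status == "404":  # dead link
            dead_links.add(row.get("cs-uri-stem", ""))

    print(status_count.most_common())  # e.g. [('200', 812), ('304', 97), ('404', 12)]
    print(sorted(dead_links))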
Third: you can know whether the server running our site is operating stably. If you see a large number of 500 codes, the server has a problem; in the IIS log, any code beginning with 5 is telling us about the state of the server.
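Under the same assumptions as the sketches above, a quick filter for codes beginning with 5 flags server trouble:

    # reuses parse_w3c_log from the first sketch
    errors = [row for row in parse_w3c_log("ex120501.log")
              if row.get("sc-status", "").startswith("5")]
    if errors:
        print(len(errors), "requests returned 5xx; check the server")
        for row in errors[:10]:  # show a sample
            print(row.get("time"), row.get("cs-uri-stem"), row.get("sc-status"))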
Fourth: when a problem appears on our site, we will find that the number of spider visits keeps dropping. Our first reaction should be to review our recent operations, and to check whether any site we exchanged links with has run into trouble. Combined with the site's snapshot updates, backlink data, and indexing status, we can make a fairly accurate judgment and respond before the site is punished, keeping the loss to our site to a minimum. So keeping records of our daily operations and of the links we exchange with others is a good work habit every SEOer must develop.
Fifth: if we find 404 codes in the IIS log, we can use robots.txt to block those links and keep spiders from crawling them. If you do not have the habit of checking IIS logs, you will not even know the dead links exist, let alone block them. A site where spiders hit many dead links does not look good to them, and such a site will not be given high weight. Some friends say they can check dead links with webmaster tools, such as Chinaz's dead-link detection or the crawl errors in Google Webmaster Tools, but do not forget that tools have their off days too; what the IIS log reflects is the real situation. To analyze IIS logs we can use an IIS log viewer to reduce our workload and save working time. A sketch that turns 404s into robots.txt rules follows.
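As a final sketch under the same assumptions, the 404 URLs pulled from the log can be turned into robots.txt Disallow rules. The output below is illustrative; the actual paths would come from your own log, and you would paste the rules into your site's robots.txt.

    # reuses parse_w3c_log from the first sketch
    dead = sorted({row.get("cs-uri-stem", "")
                   for row in parse_w3c_log("ex120501.log")
                   if row.get("sc-status") == "404"})

    print("User-agent: *")  # block all spiders from the dead URLs
    for url in dead:
        print("Disallow: " + url)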
Every SEOer should develop the good habit of checking the IIS logs every day and keeping records, so that once the site runs into trouble there is something to look back on. Query them at least every four days, because on a virtual host the IIS logs are typically kept for four days at most.
The above five points of information all come from interpreting the return codes in the IIS log, so being able to read these codes is important. Many veterans have already summarized them, and friends who need them can find plenty of relevant material by searching. I also hope the veterans will fill in whatever this article lacks.
This article is original to Shen Edge SEO. Please keep the link when reprinting and credit the author, thank you! Welcome to discuss and learn together: www.chenyuan66.com QQ: 32092216