When your site's snapshot stops updating, rolls back, or the site even seems to drop out of the results, your first instinct is to check whether you have been publishing original content regularly and whether the sites you exchange links with have problems. In fact, on Baidu a rolled-back or stale snapshot is normal. It is usually not a problem with your site or with your link-exchange partners; Baidu has simply held back the snapshot or not released it yet. What you need to keep doing is publishing original content and building links every day. So how do you judge whether these situations are actually caused by your own site? It comes down to whether you check your website log every day:
Website log analysis is always the most complete and most accurate method. Through the site log we can see how search engine spiders crawl our site and how long they stay, check whether site content has been indexed, and find out where the spider ran into error messages. Now that you know the benefits of reading logs, you probably want to know how to analyze them, so Taizhou SEO will explain how to do website log analysis. First you have to get the log. Many people cannot find one in their site's root directory; that is because the website log function has not been enabled in the hosting control panel. Once you enable it there, from the next day on you will find the log in a folder named "log" under the site root. Download the log to your local machine. Because the raw log records the spider's crawl activity in a format that is hard to read, use a tool (the Guangnian light-year log analysis tool, which you can find directly through a Baidu search) to analyze the log quickly and present the conclusions. All you have to do then is keep an Excel sheet to record the data. Keep recording these numbers over the long term so you can see the spider's long-term crawling trend on your site and know which direction the site is developing in.
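For readers who want to inspect the raw file before reaching for the tool mentioned above, here is a minimal sketch of filtering an access log for Baiduspider requests and grouping them by HTTP status so crawl errors stand out. The file name "access.log", the combined Apache/Nginx log layout, and the "Baiduspider" user-agent match are all assumptions, not something the light-year tool requires.

```python
# Minimal sketch: pull Baiduspider requests out of a combined-format access log
# and group them by HTTP status, so crawl errors (404, 500, ...) stand out.
# "access.log" and the log layout are assumptions about a typical Apache/Nginx setup.
import re
from collections import Counter

LINE_RE = re.compile(r'"(?:GET|POST|HEAD) (?P<url>\S+) HTTP/[^"]*" (?P<status>\d{3}) ')

status_counts = Counter()
error_urls = []
with open("access.log", encoding="utf-8", errors="ignore") as f:
    for line in f:
        if "Baiduspider" not in line:      # keep only spider traffic
            continue
        m = LINE_RE.search(line)
        if not m:
            continue
        status_counts[m.group("status")] += 1
        if m.group("status").startswith(("4", "5")):
            error_urls.append((m.group("status"), m.group("url")))

print("Spider requests by status:", dict(status_counts))
print("URLs the spider hit errors on:", error_urls[:20])
```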
Data is always the most direct and persuasive evidence; doing SEO means focusing on data analysis, and this should be a basic skill for any SEOer. The log-tracking sheet should include: total pages crawled, number of crawl visits, crawl time, unique pages crawled, average pages crawled per visit, average time per visit, and the number of 404 pages the crawler encountered. Take the time to record this data every day so you know how your site is actually doing, instead of relying on a vague feeling. Doing SEO means learning to let the data speak, because data is usually more persuasive than imagination.
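As a rough illustration of building that daily tracking sheet, the sketch below tallies total spider crawls, unique pages crawled, and 404s from one day's log and appends a row to a CSV that can be opened in Excel. The paths, log format, and "Baiduspider" match are assumptions for the example; crawl time and per-visit averages would come from the analysis tool or from timestamps in your own log.

```python
# Minimal sketch: tally the daily figures mentioned above (total crawls,
# unique pages, 404s hit by the spider) and append one row to a CSV report.
# File names and log format are assumptions, not part of any particular tool.
import csv
import re
from datetime import date

LOG_PATH = "access.log"            # hypothetical downloaded log file
REPORT_PATH = "spider_report.csv"  # one row per day, importable into Excel
LINE_RE = re.compile(r'"(?:GET|POST|HEAD) (?P<url>\S+) HTTP/[^"]*" (?P<status>\d{3}) ')

total = 0
unique_urls = set()
not_found = 0
with open(LOG_PATH, encoding="utf-8", errors="ignore") as f:
    for line in f:
        if "Baiduspider" not in line:
            continue
        m = LINE_RE.search(line)
        if not m:
            continue
        total += 1
        unique_urls.add(m.group("url"))
        if m.group("status") == "404":
            not_found += 1

# Append today's row so the sheet builds up a long-term trend.
with open(REPORT_PATH, "a", newline="") as out:
    csv.writer(out).writerow([date.today().isoformat(), total, len(unique_urls), not_found])
```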
Original URL: http://www.suting52.com/445.html (please credit the source when reposting)