How Webmasters Should Conduct Website Log Analysis


A competent webmaster or SEOer must be able to read the website's server log files. These logs record the traces left when search engines crawl the site and provide solid evidence of whether a spider has visited. By analyzing the logs, webmasters can examine how search engine spiders crawl the site and check whether anything abnormal is happening. We can also use the log file to judge the frequency of spider visits and the patterns they follow when crawling, which is very helpful for optimization. Learning to analyze website log files is a skill every webmaster must have, and it is the only way to advance from a novice SEOer to an SEO master. The prerequisite is that your hosting service has the log statistics function enabled. Ordinary virtual host providers often leave it off, so you may need to apply to have it enabled or turn it on yourself in the server management backend. Note that logs take up disk space, so after reading a log file you can clean up the old ones. So how do you analyze the server log files? Let me explain.

When a search engine crawls a website, it leaves information on the server, and that information is stored in the website's log files. Through the logs we can understand how search engines access the site. Generally, you enable the log function through your hosting service, then connect to the site's root directory over FTP, where you will find a log or weblog folder containing the log files. Download a log file and open it with Notepad (or a browser) to see its contents. So what is hidden in this diary? In fact, a log file is like an airplane's black box: we can extract a great deal of information from it. So what exactly does this log tell us?
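For reference, here is what a single entry in a typical access log looks like. This sample assumes the common Apache/Nginx combined format with an invented visitor; the exact layout varies by host:

    203.0.113.50 - - [12/Nov/2012:08:30:15 +0800] "GET /index.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"

Reading left to right, the fields are: the visitor's IP address, the time of the request, the requested page, the HTTP status code (200 here), the response size in bytes, the referrer, and the user agent, which in this entry identifies Baidu's spider.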

To understand what a website log file contains, you must first know the names of the search engine spiders: Baidu's spider program is called Baiduspider, Google's robot is called Googlebot, and so on. Searching the log for these spider names tells us which search engines have crawled the site; this is where they leave their clues. Furthermore, you must be able to read the common HTTP status codes. The most frequent ones are 200 (the page was fetched successfully), 304 (Not Modified: the page has not changed since the last crawl), 404 (Not Found: a missing page or broken link), and 500 (an internal server error, usually caused by server maintenance or failure that leaves the site unreachable). Every webmaster must understand these status codes; they are the signals through which the server communicates with the spiders. With this basic knowledge we can analyze the site log. Generally we only look at how Baidu's and Google's spiders crawl and fetch the site, though for special needs you can also analyze a few other spiders. If Googlebot and Baiduspider appear in large numbers in the log, the search engine spiders are visiting your site often.
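As a simple illustration, the following minimal Python sketch filters a log for these spider names. The file name access.log is an assumption; adjust it (and the spider strings) to match your own host:

    # A minimal sketch: print every log line left by a known spider.
    # "access.log" is an assumed file name; change it to your host's log file.
    SPIDERS = ("Baiduspider", "Googlebot")

    with open("access.log", encoding="utf-8", errors="replace") as log:
        for line in log:
            # The spider's name appears in the user-agent field of each entry.
            if any(name in line for name in SPIDERS):
                print(line.rstrip())

Running this over a day's log quickly shows which engines visited and which URLs they requested.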

Having covered how to parse log files, we should also talk about when to parse them. Under what circumstances should we analyze the logs? First, right after a new site is established. This is when webmaster friends are most anxious, eagerly waiting for the search engines to include the site's content, and the usual habit is to run the site: command on Baidu or Google to check whether the domain has been indexed. In fact, at this stage there is no need to query inclusion so frequently. To know whether the search engines are paying attention to our site, we can simply look at the log files. How? Check whether search engine spiders have crawled the site and what status code they received. A 200 means the crawl succeeded; a 404 means the page is broken or does not exist, and you should set up a 301 permanent redirect or a 302 temporary redirect. After a successful crawl, the pages are usually released some time later. Googlebot generally releases pages quickly, sometimes within seconds, while Baidu responds slowly, taking about a week at best; however, after Baidu adjusted its algorithm in November, its release speed became very fast. Second, when the site's inclusion becomes abnormal, we should compare logs from the normal period against the abnormal logs to find the problem; this both solves the inclusion issue and informs overall optimization. Third, when the site has been K'd (deindexed) by a search engine, we must watch the log files to mend the damage. In this situation the logs usually show only a few spider visits to the home page and robots.txt. We have to find the reason for the penalty and correct it, then resubmit the site to the search engine and watch the logs to see whether the spiders return to normal. If over time the number of spider visits increases and they regularly return a 200 status, congratulations: your site has come back to life. If there is no response after six months, it is better to give up that domain and start fresh.
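To put these checks into practice, a small script can tally the status codes each spider received. This is a sketch under the same assumptions as above (a combined-format access.log and the two spider names); a healthy site shows mostly 200s, while a flood of 404s or 500s points to the problems described in this section:

    # A minimal sketch: count HTTP status codes per spider in "access.log"
    # (an assumed file name, combined log format assumed).
    import re
    from collections import Counter

    SPIDERS = ("Baiduspider", "Googlebot")
    # The status code is the first three-digit number after the quoted request.
    STATUS_RE = re.compile(r'" (\d{3}) ')

    counts = {name: Counter() for name in SPIDERS}
    with open("access.log", encoding="utf-8", errors="replace") as log:
        for line in log:
            for name in SPIDERS:
                if name in line:
                    match = STATUS_RE.search(line)
                    if match:
                        counts[name][match.group(1)] += 1

    for name, status_counts in counts.items():
        # Mostly 200s means successful crawls; spikes in 404 or 500
        # are exactly the anomalies worth investigating.
        print(name, dict(status_counts))

Comparing these tallies across days (normal period versus abnormal period) is the comparative analysis described above.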

Many webmaster friends do not know how to use website log files. When they run into inclusion problems they ask others instead of doing a proper self-check, which is a sorry state for any webmaster or SEOer. Plenty of promotional articles online mention the importance of log file analysis, but they are only fluff; perhaps their authors never looked at a log file themselves. In the end, I hope webmaster friends will not ignore their website log files: making good use of them is a necessary skill for any webmaster or SEOer. Moreover, reading log files does not require advanced coding knowledge; as long as you can understand HTML code and a few return status codes, you are fine. Do not be lazy or treat your site with a gambler's mentality, because that attitude will cost you dearly. If you are a small webmaster or an SEOer and you have not yet realized the importance of site log files, then start taking good care of your logs from the moment you finish this article. This article was originally written by Trend Shopping (www.xiaotao5.com). Thanks to Webmaster Network!
