A qualified SEO should truly analyze the IIS log

Source: Internet
Author: User
Tags: IIS log, root directory

Several problems have long plagued us on the road of optimization, such as: are the external links we build having any effect? Is the hosting space we bought stable? What do spiders like about our pages, and what do they dislike? When do spiders crawl our site most frequently, and when should we update content? All of these questions can actually be answered with a simple analysis of our server's IIS log. A qualified SEO should genuinely analyze the IIS log rather than merely glance at the status codes. A reminder here: when buying space, make sure you can download the IIS log; if it cannot be downloaded, do not buy it. Below are some points on analyzing website problems through the IIS log.

I. The important role of the IIS log

1. Through the IIS log we can understand the spider's basic crawling of the site: its crawl path and its crawl volume. The number of external links we build has a direct influence on the spider crawl volume recorded in the log. This is the idea behind link bait: if you place an external link, the spider crawls that external page, and once that page is released it can follow the link you left to crawl your site; the IIS log is what records those crawl actions.

2. The site's update frequency is related to how often spiders crawl it in the IIS log. Generally speaking, the higher the update frequency, the more frequently spiders crawl, and updating the site means not only adding new content but also our fine-tuning operations.

3. From the responses recorded in the IIS log we can be alerted early to problems with our space, because if the server has a problem it will be reflected in the IIS log first. The server's stability and its page-loading speed both directly affect our website.

4. Through the IIS log we can learn which pages of the site are very popular with spiders and which pages are never touched by them. We can also discover spiders whose excessive crawling consumes a great deal of server resources, and those we have to block.

II. How to download the log, and considerations for log settings

1. First of all, our space must support IIS log downloads; this is very important. Before buying space, ask the provider whether IIS log downloads are supported, because some service providers do not offer this service. If it is supported, the space's control panel generally has a "WebLog" log-download function: download the log from the root directory via FTP to your local machine. The server can also be configured to write the log files to a specified path.
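If your provider only exposes the logs over FTP, the download can also be scripted instead of done by hand. Below is a minimal Python sketch of that download; the host name, credentials, and log file name are placeholders, so substitute whatever your own space provider gives you.

# Minimal sketch: fetch an IIS log file over FTP to the local disk.
# Host, credentials, and file name below are hypothetical placeholders.
from ftplib import FTP

FTP_HOST = "ftp.example.com"      # hypothetical FTP host from your provider
FTP_USER = "your-username"        # hypothetical credentials
FTP_PASS = "your-password"
LOG_NAME = "ex120313.log"         # hypothetical log file name in the root directory

with FTP(FTP_HOST) as ftp:
    ftp.login(FTP_USER, FTP_PASS)
    # Download the log file from the FTP root directory to the local disk.
    with open(LOG_NAME, "wb") as local_file:
        ftp.retrbinary("RETR " + LOG_NAME, local_file.write)
print("Downloaded", LOG_NAME)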

2. Here is a very important point: it is strongly recommended to set the IIS log to be generated once per hour. Small enterprise sites and sites with little content can set it to once per day, which is the default. If a content-heavy or large site is set to generate one file per day, that single file will be quite large, and opening it can sometimes freeze the computer. Ask your space provider to coordinate the setting for you.

III. Analysis of IIS logs

1. The log file's suffix is .log. Open it with Notepad and turn on Word Wrap so it is easier to read, then use the search function to look for the two spiders, Baiduspider and Googlebot.

For example:

Baidu Spider

2012-03-13 00:47:10 W3SVC177 116.255.169.37 GET / - 80 - 220.181.51.144 Baiduspider-favo+(+http://www.baidu.com/search/spider.htm) 200 0 0 15256 197 265

Google Robot

2012-03-13 08:18:48 W3SVC177 116.255.169.37 GET /robots.txt - 80 - 222.186.24.26 Googlebot/2.1+(+http://www.google.com/bot.html) 200 0 0 985 200 31

Let us explain it field by field.

2012-03-13 00:47:10 is the date and time of the spider's crawl.

W3SVC177 is the site instance code on the machine; it is the only field we do not need to care about.

116.255.169.37 is the IP address of the server.

GET represents the request event (the HTTP method).

What follows GET is the page the spider crawled; a single slash represents the home page.

80 is the port number.

220.181.51.144 is the spider's IP. Here is a way to tell real Baidu spiders from fake ones: click Start, choose Run, and type cmd to open the command prompt, then type nslookup, a space, and the spider's IP, and press Enter. A genuine Baidu spider resolves to Baidu's own server hostname, while a fake spider does not.

As shown in the figure, the first lookup result is a real spider, and the one underneath it is a fake spider.

If a large number of fake spiders appear on the site, it means someone is impersonating the Baidu spider to scrape your content, and you need to pay attention: if it becomes too rampant it will occupy a great deal of your server resources, and we need to block their IPs.
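To check spiders in bulk rather than one at a time in the command prompt, the same reverse lookup that nslookup performs can be scripted. Below is a minimal Python sketch; it assumes the usual rule that genuine Baiduspider IPs reverse-resolve to a *.baidu.com or *.baidu.jp hostname, and the sample IP is the one from the log line above.

# Minimal sketch of the nslookup check: reverse-resolve the client IP and
# see whether the hostname belongs to Baidu. Assumption: real Baiduspider
# IPs resolve to *.baidu.com or *.baidu.jp; anything else is suspect.
import socket

def looks_like_real_baiduspider(ip):
    """Return True if the IP reverse-resolves to a Baidu hostname."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
    except socket.herror:
        return False          # no reverse DNS record at all: treat as fake
    return hostname.endswith((".baidu.com", ".baidu.jp"))

# Example: the spider IP from the Baiduspider log line above.
ip = "220.181.51.144"
print(ip, "real spider" if looks_like_real_baiduspider(ip) else "suspect")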

200 0 0 are the status codes; the meaning of each status code can be searched on Baidu.

197 265: the last two numbers represent the number of bytes of data accessed and downloaded.
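Putting the explanation above together, here is a minimal Python sketch that splits one log line into named fields. It assumes the field order of the sample lines shown earlier (the W3C extended format with that particular #Fields list); if your log's #Fields header differs, adjust the names accordingly.

# Minimal sketch: split one IIS log line into named fields.
# The field names below are an assumption based on the sample lines above.
FIELDS = [
    "date", "time", "site", "server_ip", "method", "uri", "query",
    "port", "username", "client_ip", "user_agent",
    "status", "substatus", "win32_status",
    "bytes_sent", "bytes_received", "time_taken",
]

line = ("2012-03-13 00:47:10 W3SVC177 116.255.169.37 GET / - 80 - "
        "220.181.51.144 Baiduspider-favo+(+http://www.baidu.com/search/spider.htm) "
        "200 0 0 15256 197 265")

record = dict(zip(FIELDS, line.split()))
print(record["client_ip"], record["uri"], record["status"], record["time_taken"])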

2. When analyzing, first look at the status codes: 200 means the page was downloaded successfully, 304 means the page has not been modified, and 500 means a server timeout. These are the common ones; other codes can be looked up on Baidu, and different problems have to be handled differently.
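As a quick way to apply this, the sketch below tallies the status codes across a whole log file. The file name ex120313.log is a placeholder, and the field position follows the sample layout above.

# Minimal sketch: count status codes in an IIS log file.
# "ex120313.log" and the field index are assumptions from the sample layout.
from collections import Counter

status_counts = Counter()
with open("ex120313.log", encoding="utf-8", errors="ignore") as log:
    for line in log:
        if line.startswith("#"):           # skip the #Fields / #Date header lines
            continue
        parts = line.split()
        if len(parts) >= 12:
            status_counts[parts[11]] += 1  # sc-status, 12th field in the sample layout
print(status_counts.most_common())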

3. We should note down which pages spiders crawl frequently and analyze why they are crawled so often, so that we can work out what content spiders like.
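One simple way to record this is to count how often the spider requests each page. The sketch below lists the pages Baiduspider hits most often, again using the placeholder file name and the sample field layout.

# Minimal sketch: list the pages Baiduspider requests most often.
from collections import Counter

page_hits = Counter()
with open("ex120313.log", encoding="utf-8", errors="ignore") as log:
    for line in log:
        if "Baiduspider" not in line:
            continue
        parts = line.split()
        if len(parts) >= 6:
            page_hits[parts[5]] += 1       # cs-uri-stem, 6th field in the sample layout
for uri, hits in page_hits.most_common(10):
    print(hits, uri)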

4. Sometimes our paths are not uniform, appearing both with and without a trailing slash; the spider will automatically recognize this and 301-redirect to the page with the slash. This shows the search engine is able to determine our directory structure, so we should unify our paths ourselves.

5. After analyzing the logs over a long period, we can see the spider's crawling pattern: the crawl-frequency interval for individual files under the same directory, and the interval for different directories, can both be observed. These intervals are determined automatically by the spider according to the site's weight and update frequency.
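A rough way to see those intervals is to measure the gaps between successive spider hits per directory. The sketch below averages the gap, in seconds, between Baiduspider visits to each top-level directory; the file name and field layout are the same assumptions as in the earlier sketches.

# Minimal sketch: average time between Baiduspider visits per top-level directory.
from collections import defaultdict
from datetime import datetime

last_seen = {}
gaps = defaultdict(list)

with open("ex120313.log", encoding="utf-8", errors="ignore") as log:
    for line in log:
        if "Baiduspider" not in line:
            continue
        parts = line.split()
        if len(parts) < 6:
            continue
        when = datetime.strptime(parts[0] + " " + parts[1], "%Y-%m-%d %H:%M:%S")
        directory = "/" + parts[5].strip("/").split("/")[0]   # top-level directory
        if directory in last_seen:
            gaps[directory].append((when - last_seen[directory]).total_seconds())
        last_seen[directory] = when

for directory, deltas in gaps.items():
    print(directory, round(sum(deltas) / len(deltas)), "seconds between crawls")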

6. The spider's crawling of our pages is tiered, in descending order of weight: generally the home page first, then directory pages, then inner pages.

7. Spiders coming from different IPs crawl at different frequencies.

The IIS log is important reference data for analyzing the site. We should analyze it often and summarize our experience, so that all kinds of problems are at our fingertips.

This article was originally written by http://www.51diaoche.net and published on A5; reprints are welcome.


