As a webmaster, do you know the role of the server IIS log?

For an SEOer, the server IIS log is a very important optimization reference, because from it we can see how search engine spiders are crawling, understand something about the state of the site itself, and even analyze where some of our visitors come from, without necessarily relying on third-party statistics code. Of course, some IIS logs are restricted by the IDC (hosting) provider and must be enabled before they can be viewed; if that is your situation, I suggest downloading and installing some logging source code of your own first, so that you can see this data. Below, I will explain in detail how the server IIS log helps a website.
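
To make the rest of this article concrete, here is a minimal Python sketch of my own (not part of any official tool) that splits the W3C-format lines of an IIS log into named fields. The log file name is a made-up example, and the real column layout comes from the "#Fields:" directive that IIS writes at the top of each log file.

def parse_iis_log(path):
    """Yield each data line of a W3C-format IIS log as a dict of field -> value."""
    fields = []
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            line = line.strip()
            if line.startswith("#Fields:"):
                fields = line.split()[1:]   # e.g. date time cs-uri-stem cs(User-Agent) sc-status ...
                continue
            if not line or line.startswith("#"):
                continue                    # skip other directive lines and blanks
            values = line.split(" ")
            if fields and len(values) == len(fields):
                yield dict(zip(fields, values))

if __name__ == "__main__":
    for entry in parse_iis_log("u_ex140101.log"):   # file name is a made-up example
        print(entry.get("cs-uri-stem"), entry.get("sc-status"))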

(1) Look at the number of spider crawls.

A search engine spider is a robot that crawls the content of our site and submits what it finds to the search engine's database, so knowing how often spiders crawl indirectly tells us how much the search engine favors our site. In addition, by comparing where the spiders come from, we can work out which of our external links are more useful and more valued by the spiders, and also which pages of our site the spiders like and which they never visit.
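
As a rough illustration of counting spider visits, here is a small Python sketch; the spider list and log file name are assumptions you should adapt to your own site.

from collections import Counter

SPIDERS = ["Baiduspider", "Googlebot", "bingbot", "360Spider", "Sogou"]   # illustrative list

def count_spider_hits(log_path):
    """Count log lines whose user-agent text mentions a known spider."""
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for line in f:
            if line.startswith("#"):          # skip W3C directive lines
                continue
            low = line.lower()
            for name in SPIDERS:
                if name.lower() in low:
                    counts[name] += 1
                    break
    return counts

for spider, hits in count_spider_hits("u_ex140101.log").most_common():
    print(spider, hits)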

As for harmful search engine spiders, we need to block them. After all, spiders come in many kinds, and if one day we find a large number of unidentified spiders crawling our site heavily, we should block them to avoid having our weight reduced. In addition, I urge you not to use tools that lure spiders into crawling your own site.
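
One common way to tell a genuine spider from a fake one is a reverse DNS lookup on the client IP; the sketch below shows the idea in Python. The accepted domain suffixes are only the commonly cited ones, so treat them as assumptions and confirm them against each search engine's own documentation.

import socket

# Commonly cited reverse-DNS suffixes; treat these as assumptions.
TRUSTED_SUFFIXES = {
    "Baiduspider": (".baidu.com", ".baidu.jp"),
    "Googlebot": (".googlebot.com", ".google.com"),
}

def looks_genuine(spider_name, client_ip):
    """Return True if the IP's reverse DNS ends with a trusted suffix for that spider."""
    try:
        hostname, _, _ = socket.gethostbyaddr(client_ip)
    except OSError:
        return False                # no (or failed) reverse DNS record
    return hostname.endswith(TRUSTED_SUFFIXES.get(spider_name, ()))

# Example with a made-up IP pulled from a log entry:
# print(looks_genuine("Baiduspider", "123.125.71.12"))

A stricter check also resolves the returned hostname back to the IP (forward-confirmed reverse DNS) before trusting it.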

When search engine spiders crawl our site heavily, they undoubtedly take up a certain amount of our server's resources, but heavy crawling is also a great help to our site's weight, so I suggest you choose a better server to avoid a crash at the critical moment. Another thing we learn from various official bulletins is that many hackers use spider-style crawling to steal a site's data, so we need to watch out for that as well.

(2) See how the spider crawls pages.

Generally speaking, the page spiders value most is our homepage, so the homepage snapshot is usually the one updated most frequently. If the inner pages are also crawled frequently, we get the "instant indexing" effect we often talk about. If some of our pages are never crawled by spiders, we can check the IIS log to see whether we have, in effect, prohibited spiders from crawling them. In addition, I know many webmasters only build external links to the homepage; here I urge you to also build links to column pages and article pages, which is a great help to getting those pages indexed. From the log we can also learn about the state of our site: which pages the spider enters from, which pages it crawls frequently, and which it neither crawls nor indexes. Summing this up, we can see more precisely which content the spiders favor, and whether they crawl because they are interested in our content or because of the pull of external links.
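
If you want to see which pages a particular spider keeps coming back to, a sketch along these lines can tally its requests per URL; the file name and the choice of Baiduspider are assumptions.

from collections import Counter

SPIDER = "Baiduspider"                                  # illustrative choice
page_counts, idx = Counter(), None
with open("u_ex140101.log", encoding="utf-8", errors="replace") as f:   # made-up file name
    for line in f:
        parts = line.strip().split(" ")
        if line.startswith("#Fields:") and "cs-uri-stem" in parts:
            idx = parts.index("cs-uri-stem") - 1        # -1: data lines have no "#Fields:" token
        elif (not line.startswith("#") and idx is not None
              and len(parts) > idx and SPIDER.lower() in line.lower()):
            page_counts[parts[idx]] += 1
print(page_counts.most_common(20))                      # the pages this spider visits most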

(3) Analyze the site's HTTP status codes.

When spiders crawl the content of our site, they usually leave an HTTP status code in the log. Generally speaking, a 200 return means the page was fetched successfully, but it does not mean the page will be released (indexed) right away; there may still be a review period.
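
A quick tally of the sc-status column makes this analysis easy; the following sketch (with an assumed log file name) counts how many 200s, 404s and so on appear in the log.

from collections import Counter

status_counts, idx = Counter(), None
with open("u_ex140101.log", encoding="utf-8", errors="replace") as f:   # made-up file name
    for line in f:
        parts = line.strip().split(" ")
        if line.startswith("#Fields:") and "sc-status" in parts:
            idx = parts.index("sc-status") - 1       # -1: data lines have no "#Fields:" token
        elif not line.startswith("#") and idx is not None and len(parts) > idx:
            status_counts[parts[idx]] += 1
print(status_counts.most_common())                    # e.g. [('200', 1234), ('404', 56), ...]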

Below, I answer two questions about this in a self-question-and-answer format, which I hope will help webmasters.

1. When a page does not exist on the site, should it return 404 or 200?

The answer is certainly that returning 404 is correct, because we all know that a 404 page tells the search engine this page is an inaccessible, erroneous page. A 200 is different: it tells the spider the page can be crawled, yet when the spider crawls it, it finds the page is actually inaccessible. Too many such pages will directly cause our site to be demoted, or even banned, by the search engine.

2. When the site is under construction or going through ICP filing, which status code should it return?

The answer is 503, because 503 tells the search engine that our site is only temporarily unavailable and will recover within a certain time. If another status code is returned, the search engine will not understand it that way; a 404 in particular will make the search engine think the page simply no longer exists.

In fact, a 404 page is also a great help to our site, so I recommend you remember to make a 404 page for your site.
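
If you want to check what your own site actually returns for a missing page, a few lines of Python are enough; the URL below is a made-up example.

from urllib import error, request

def status_of(url):
    """Return the HTTP status code the server sends back for this URL."""
    try:
        with request.urlopen(url) as resp:
            return resp.status          # 200 here for a missing page would be a problem
    except error.HTTPError as e:
        return e.code                   # a proper 404 (or 503 during maintenance) arrives here

print(status_of("http://www.example.com/this-page-should-not-exist"))   # made-up URL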

(4) Use professional tools for log analysis.

The PV value of a site is undoubtedly one expression of its user experience. When our bounce rate is high, the site is usually either failing to open or has poor content, and in that state rankings are out of the question; nor can we tell which pages get the most visits, which is exactly what lets us explore users' needs and improve. And if our site fails to open for a long time, or is slow to access, we can check the log to see whether there is malicious traffic attacking us; faced with that situation, we either compromise, report it, or simply change servers.
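
To spot such malicious traffic quickly, you can tally requests per client IP (c-ip); the sketch below follows the same pattern as the earlier ones, with an assumed log file name.

from collections import Counter

ip_counts, idx = Counter(), None
with open("u_ex140101.log", encoding="utf-8", errors="replace") as f:   # made-up file name
    for line in f:
        parts = line.strip().split(" ")
        if line.startswith("#Fields:") and "c-ip" in parts:
            idx = parts.index("c-ip") - 1             # column position of the client IP
        elif not line.startswith("#") and idx is not None and len(parts) > idx:
            ip_counts[parts[idx]] += 1
print(ip_counts.most_common(10))                       # the ten busiest client IPs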

For a website, if users do not click on your site, that is enough to show your site is not attractive, and search engines will also consider it a poor site, so doing the user experience work well is very important. Finally, I recommend two more professional IIS log analysis tools:

1. AWStats

2. Webalizer

Both of these tools can also analyze some of the site's status codes (⊙o⊙) Oh!

Summary: A website's IIS log is a great help to everything from site optimization to user experience work. It adds to our understanding of our own site and reminds us to attend to small details we had neglected, so that our site can naturally win the search engine's favor. It also gives us some warning before our site is banned or demoted, so that we can make targeted improvements and, in many cases, avoid it altogether. This article is an original share by Name Net (Http://www.name2012.com); friends who reprint it, please remember to keep the link and copyright, and thank you for that. Well, that is all I have to share today; I will often communicate with you on this platform, so see you next time.
