The role of Web log research for SEO

Source: Internet
Author: User


Hit by the double blow of the Ministry's latest anti-vice filing crackdown and the update to Baidu's Phoenix Nest system, the new websites I recently launched have performed very badly, all running into the same problems: snapshots rolled back, daily updates indexed but the index count unchanged, and overall site traffic declining. These problems plagued us for several days, so we set out to find the cause. The first thing to look at is the site log. After repeatedly tracing Baidu spider's crawl routes, plus checking the links to Www.zhengxing88.com's column 12 hours later, I believe this method can very effectively help us uncover a site's hidden problems.

Understanding how search engines crawl pages:

By analyzing search engine crawler access records, we can find clues about how the site is being indexed:

Have the search engine's recent visits to the site been normal?

Which parts of the site do search engines prefer to visit?

Which parts of the site do search engines rarely visit?

Has the search engine visited content we have forbidden it to crawl (for example, via robots.txt)?
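The questions above can be answered with a short log-parsing script. Below is a minimal sketch in Python, assuming logs in the common Apache/Nginx combined format; the sample log lines, the section names, and the /private/ disallowed prefix are illustrative assumptions, not details from the original article:

```python
import re
from collections import Counter

# Illustrative sample lines standing in for a real access log file.
SAMPLE_LOG = """\
220.181.108.75 - - [10/May/2010:08:12:01 +0800] "GET /news/123.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"
220.181.108.76 - - [10/May/2010:08:12:05 +0800] "GET /news/124.html HTTP/1.1" 200 4800 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"
66.249.66.1 - - [10/May/2010:08:13:00 +0800] "GET /about.html HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
220.181.108.75 - - [10/May/2010:08:14:10 +0800] "GET /private/admin.html HTTP/1.1" 200 1024 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0)"
"""

# Pull the request path, status code, and user agent from a combined-format line.
LINE_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+) [^"]*" (\d{3}) \S+ "[^"]*" "([^"]*)"')

def spider_section_counts(log_text, spider="Baiduspider", disallowed=("/private/",)):
    """Count crawler hits per top-level site section, and flag visits
    to paths we meant to block (e.g. via robots.txt)."""
    sections = Counter()
    blocked_hits = []
    for line in log_text.splitlines():
        m = LINE_RE.search(line)
        if not m:
            continue
        path, status, agent = m.groups()
        if spider not in agent:
            continue
        section = "/" + path.lstrip("/").split("/")[0]
        sections[section] += 1
        if any(path.startswith(p) for p in disallowed):
            blocked_hits.append(path)
    return sections, blocked_hits

sections, blocked = spider_section_counts(SAMPLE_LOG)
print(sections.most_common())  # which sections the spider prefers
print(blocked)                 # disallowed paths it still crawled
```

A sudden drop in the spider's daily hit count, or any entry in the blocked list, points directly at the problems described above.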

Checking whether site content and links are normal:

The following problems can be diagnosed by analyzing the HTTP status codes returned by the server:

Whether there are dead links (404);

Whether page elements such as images, CSS, or scripts have been mistakenly deleted (also 404);

Whether the server has suffered temporary failures (5xx);

Whether there are temporary redirects (302);

Whether access controls are preventing the search engine from crawling data (403);
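Each of the problems above maps to a status code, so a simple tally over the log surfaces all of them at once. A hedged sketch in Python; the sample lines and the code-to-problem mapping are illustrative, not from a real server:

```python
import re
from collections import Counter

# Illustrative combined-format log lines, one per problem class.
SAMPLE_LOG = """\
1.2.3.4 - - [10/May/2010:09:00:00 +0800] "GET /old-page.html HTTP/1.1" 404 512 "-" "Baiduspider"
1.2.3.4 - - [10/May/2010:09:00:01 +0800] "GET /css/style.css HTTP/1.1" 404 512 "-" "Baiduspider"
1.2.3.4 - - [10/May/2010:09:00:02 +0800] "GET /promo HTTP/1.1" 302 0 "-" "Baiduspider"
1.2.3.4 - - [10/May/2010:09:00:03 +0800] "GET /members/ HTTP/1.1" 403 256 "-" "Baiduspider"
1.2.3.4 - - [10/May/2010:09:00:04 +0800] "GET /index.html HTTP/1.1" 500 0 "-" "Baiduspider"
"""

# The status code follows the quoted request line in combined format.
STATUS_RE = re.compile(r'" (\d{3}) ')

# Map status codes to the problems discussed above.
MEANING = {
    "404": "dead link or deleted page element",
    "302": "temporary redirect",
    "403": "access control blocking the crawler",
    "500": "temporary server failure",
}

def status_report(log_text):
    """Tally HTTP status codes across all log lines."""
    codes = Counter()
    for line in log_text.splitlines():
        m = STATUS_RE.search(line)
        if m:
            codes[m.group(1)] += 1
    return codes

codes = status_report(SAMPLE_LOG)
for code, count in sorted(codes.items()):
    print(code, count, MEANING.get(code, ""))
```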

The role of log research in site security:

Detecting hotlinking of site resources;

If a third-party website embeds our images, videos, or other files directly, it wastes our server resources. Studying the logs quickly reveals this problem.
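Hotlinking shows up in the log as media requests whose Referer is a foreign host. A minimal sketch in Python, again assuming combined-format logs; the hostnames and file paths are illustrative assumptions:

```python
import re
from urllib.parse import urlparse

# Hypothetical hostname of our own site.
OUR_HOST = "www.example.com"

# Illustrative log lines: one normal image request, one hotlinked.
SAMPLE_LOG = """\
5.6.7.8 - - [10/May/2010:10:00:00 +0800] "GET /images/logo.png HTTP/1.1" 200 9000 "http://www.example.com/index.html" "Mozilla/5.0"
5.6.7.9 - - [10/May/2010:10:00:05 +0800] "GET /images/banner.jpg HTTP/1.1" 200 20000 "http://other-site.example.net/page.html" "Mozilla/5.0"
"""

# Pull the request path and the Referer field from a combined-format line.
LINE_RE = re.compile(r'"GET (\S+)[^"]*" \d{3} \S+ "([^"]*)"')
MEDIA_EXT = (".png", ".jpg", ".jpeg", ".gif", ".mp4", ".js", ".css")

def find_hotlinks(log_text, our_host=OUR_HOST):
    """Return (path, referring_host) pairs for media requests whose
    Referer points at a host other than ours."""
    hotlinks = []
    for line in log_text.splitlines():
        m = LINE_RE.search(line)
        if not m:
            continue
        path, referer = m.groups()
        if not path.lower().endswith(MEDIA_EXT):
            continue
        ref_host = urlparse(referer).hostname
        if ref_host and ref_host != our_host:
            hotlinks.append((path, ref_host))
    return hotlinks

print(find_hotlinks(SAMPLE_LOG))
```

Once the offending hosts are known, the server can be configured to refuse those referers.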

Initial analysis of whether the site's program has been hacked

If a hacker exploits bugs in the site's program and attacks it through embedded code, log analysis may reveal traces of this kind.
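Such traces often look like injection or probe strings in the request line. A hedged sketch in Python that greps log lines for a few common attack patterns; the pattern list is a small illustrative set, not a complete signature database:

```python
import re

# A few patterns that commonly accompany attack probes.
SUSPICIOUS = [
    re.compile(r"union[\s+]+select", re.I),          # SQL injection probe
    re.compile(r"<script", re.I),                    # reflected XSS attempt
    re.compile(r"\.\./"),                            # directory traversal
    re.compile(r"(?:cmd|eval)\.(?:php|asp)", re.I),  # common webshell names
]

# Illustrative log lines: two probes and one normal request.
SAMPLE_LOG = """\
9.9.9.9 - - [10/May/2010:11:00:00 +0800] "GET /item.php?id=1+union+select+1,2,3 HTTP/1.1" 200 100 "-" "-"
9.9.9.9 - - [10/May/2010:11:00:01 +0800] "GET /../../etc/passwd HTTP/1.1" 404 0 "-" "-"
1.1.1.1 - - [10/May/2010:11:00:02 +0800] "GET /index.html HTTP/1.1" 200 100 "-" "-"
"""

def suspicious_lines(log_text):
    """Return log lines matching any of the suspicious patterns."""
    return [line for line in log_text.splitlines()
            if any(p.search(line) for p in SUSPICIOUS)]

for line in suspicious_lines(SAMPLE_LOG):
    print(line)
```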

Initial analysis of whether programs are scraping data in bulk

If a search engine or third-party website uses a scraping program to repeatedly harvest large amounts of data from our site, it seriously affects server performance and lets our content flow to other sites.

By analyzing the log data, we can detect this kind of scraping.
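A bulk scraper typically stands out as one client IP with far more requests than anyone else. A minimal sketch in Python; the threshold and sample data are illustrative assumptions, and a real analysis would also bucket requests into time windows:

```python
from collections import Counter

# Illustrative client IPs extracted from the first field of each log
# line: one scraper (500 hits) and two ordinary visitors.
SAMPLE_IPS = ["8.8.8.8"] * 500 + ["1.2.3.4"] * 3 + ["5.6.7.8"] * 7

def heavy_clients(ips, threshold=100):
    """Return the IPs whose request count exceeds the threshold."""
    counts = Counter(ips)
    return {ip: n for ip, n in counts.items() if n > threshold}

print(heavy_clients(SAMPLE_IPS))
```

Flagged IPs can then be checked against known spider address ranges before being blocked.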


