My Experience: A robots.txt File Error That Kept a New Site From Being Indexed

Source: Internet
Author: User


I have been doing SEO for about a year and felt I had reached a modest level, enough to run one or two sites of my own. So in late August I opened a Taobao store selling gadgets, partly as practice. As everyone knows, Taobao and Baidu are both online businesses and compete with each other, so the two are naturally incompatible: Taobao ignores Baidu and blocks its spider from crawling. So, like many others, I decided to build my own site, promote it on Baidu, earn rankings, and funnel some extra traffic to my Taobao store. By early September the site was ready to go online. Below is the optimization process before launch:

Since I wanted to build the site properly, I followed the normal procedure. First I used a robots.txt file to block spiders from crawling while the site was under construction. Then I cleaned up the site layout, deleting some of the original JS code and other redundant junk code. Next I wrote a good site title, making sure it could not be suspected of keyword stuffing, and filled each content section with four or five original or lightly rewritten articles, so the site would not go live empty and leave search engines nothing to crawl. With the title, content, structure, and layout all done, the site was ready to launch.
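For reference, the pre-launch block described above is conventionally written with just two directives. This is a generic sketch of the standard blocking rule, not the author's actual file:

```
# robots.txt — block all crawlers while the site is under construction
User-agent: *
Disallow: /
```

Once the site goes live, the block is lifted by changing `Disallow: /` to an empty `Disallow:` (or by removing the rule entirely), which permits all crawling.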

First I changed robots.txt back to allow search engines to crawl the content, then posted one or two original articles and submitted the site to the major search engines. That evening, between 11:00 pm and 12:30 am, I went to A5's spider-attracting sections and published original articles with my site's link added, to draw spiders to crawl my site quickly. Then I sat back and waited to be indexed. I figured the site would be included within a day or two at most; to my surprise, by the third day it still had not been indexed. This felt strange, because during that period I had kept updating content and building backlinks, so by rights it should have been indexed early on. I downloaded the log from the FTP server and found that a spider had indeed come at midnight on launch night, but it had stopped at the robots.txt file. Puzzled, I reckoned there must be a mistake somewhere in robots.txt, so I opened it to look. Here is what I found:
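Reading the raw access log, as described above, can be scripted. The following is a rough sketch of my own (not the author's tool): it assumes a common-log-format access log and the standard Baiduspider/Googlebot user-agent strings, and lists which spider requested which path with what status:

```python
import re

# Matches the request, status code, and user-agent fields of a
# common-log-format line (an assumption; real log formats vary by host).
LOG_LINE = re.compile(
    r'"GET (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) .*"(?P<agent>[^"]*)"$'
)

def spider_hits(lines, bots=("Baiduspider", "Googlebot")):
    """Return (agent, path, status) for every request made by a known spider."""
    hits = []
    for line in lines:
        m = LOG_LINE.search(line)
        if m and any(bot in m.group("agent") for bot in bots):
            hits.append((m.group("agent"), m.group("path"), m.group("status")))
    return hits
```

Filtering for hits on `/robots.txt` shows at a glance whether a spider visited and got stuck there, which is exactly the symptom in this story.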

The robots.txt rules themselves looked fine, and the file generated by the program matched the backend settings. But when I went into the webmaster tools to view the crawl, the "removal tool permissions" column showed the first line as "?User-agent: *", flagged as a syntax error. Somehow there was an extra "?", and I had no idea where it came from. With no better option, I used a search-engine simulator to fetch the site and see exactly where the problem was. Sure enough, it was the text file's encoding: the file had been saved as UTF-8, and the search engine could not read the UTF-8-encoded robots.txt correctly. After I re-saved the file in ANSI encoding, the problem was finally solved.

That evening I went back to A5 to attract spiders, and the site was indexed smoothly. One small detail had delayed my site's indexing by four days; if I had not found it, the result would likely have been even worse. So when doing SEO, pay close attention to small problems: details cannot be neglected, especially robots.txt, which is the first file a spider crawls on a site, so a single mistake there causes big problems. There are also people who tamper with others' robots.txt files to harm their websites, so check yours regularly. Fortunately this kind of error is easy to spot: pull up the log and you can roughly tell what went wrong and when, so we should all develop the habit of reading our logs.
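One way to make this checking routine is to validate robots.txt with a standard parser before going live. This sketch uses Python's `urllib.robotparser`; the agent name and URL below are placeholders of mine, not values from the article:

```python
import urllib.robotparser

def rules_allow_crawling(robots_lines, agent="Baiduspider", url="http://example.com/"):
    """Parse robots.txt lines and report whether the agent may fetch the URL."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_lines)
    return rp.can_fetch(agent, url)
```

An intact file with an empty `Disallow:` permits crawling, while `Disallow: /` blocks it; running a check like this against the file your server actually serves catches rule mistakes, though encoding problems like the BOM above still require looking at the raw bytes.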

This article was published by Schindler SEO of Fuzhou Printing (http://www.fzywzx.com). Please keep this link when reprinting.
