The robots file is something we are all familiar with today, yet I have found that many webmaster friends underestimate its importance and configure it very casually. That is a mistake. I have been attending lectures while studying on my own, and today this very problem came up. I asked an SEO teacher whether the file really matters, and he told me in earnest: the robots file can either make your site or break your site. So yes, it is important.
Back to the story, and to what robots means for our websites. A good friend of mine from the SEO Research Center once had a site with thousands of pages indexed; then one afternoon the indexed count suddenly dropped to zero. He panicked: with no pages indexed there was no traffic, and a month of hard work had come to nothing. Discouraged, he ran to the group chats asking everywhere, "My site has been K'd (penalized), how do I fix it?" Many people pitched in to help him investigate, and after a long search we found the cause: his website's robots file was blocking Baidu's spider from crawling the site's pages.
Why does this happen? Because the site's robots file completely blocked Baidu's spider from crawling. Here is a fact everyone should know: even after you have written rules blocking Baidu's spider, you will find that Baidu still crawls for a while. The reason is simple: Baidu is not as quick as a person to notice the change, so the spider keeps coming for a time. Don't panic, though. After a period, Baidu discovers that the pages it has already crawled are blocked by the robots file and deletes them from its index; there is a discovery process followed by a correction process.
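For reference, the kind of rule that causes this is short. A robots.txt placed in the site root that blocks Baidu's spider from the entire site looks like this (a minimal sketch; Baiduspider is the user-agent name Baidu documents for its crawler):

    # Block Baidu's spider from the entire site
    User-agent: Baiduspider
    Disallow: /

Two lines are enough to make every page on the site off-limits to Baidu, which is exactly why a forgotten rule like this can wipe out a site's indexed pages.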
I remember him explaining why he had blocked Baidu's spider in the first place. The site had just been built, and since he was also studying SEO, he had heard teachers say that a brand-new site can temporarily block spiders while it is being finished, then edit the robots file to allow Baidu to crawl. He simply forgot that crucial last step. So while he kept working on the site and building backlinks, Baidu was deleting his indexed pages. That was a very painful period for him.
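The fix, once the site is ready for visitors, is just as short: edit the same robots.txt so spiders are allowed back in. Again a sketch, not his exact file:

    # Allow all spiders to crawl everything (empty Disallow means no restriction)
    User-agent: *
    Disallow:

This is the step he forgot, and it is worth putting on your launch checklist for any new site.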
Through this friend's experience, I hope you will all take the robots file seriously. It matters; please don't overlook it. I used to be indifferent to it myself, but after hearing the teacher discuss the problem yesterday, I started paying attention and began blocking all of my low-value pages and images. The pages are mainly dynamic ones. As we all know, Baidu prefers static pages; if your site serves the same content under many dynamic URLs, the search engine ends up crawling it over and over, which wastes the spider's time and resources and leaves an unfriendly impression. Do you think it will treat your site kindly after that? The same goes for images collected from around the web: Baidu crawls some original images, and if you reuse them on your site, Baidu may judge your site to be plagiarized or fake-original content, which can keep your site from being indexed.
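As an illustration of that kind of selective blocking, the rules might look like this (a sketch only: the /images/ path is an assumed directory for collected images, and the wildcard pattern for dynamic URLs relies on the * and $ wildcard support that Baidu documents for robots.txt):

    User-agent: *
    # Block any URL containing a query string, i.e. dynamic pages
    Disallow: /*?
    # Block the assumed directory holding collected, non-original images
    Disallow: /images/

Unlike the full-site block above, these rules only keep spiders away from the low-value content while leaving the rest of the site crawlable.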
Having said all this, my personal advice to the many webmaster friends out there is simply to take this issue seriously, because it is important. If you are not sure how to configure the file, you can leave a message on my blog and I will reply with an answer. I hope you all set up your robots file well.
Article author: seo each say. Source: http://www.xuecnc.com/seo/post/0506.html
Welcome to join the SEO exchange group 2085645361 to share and discuss SEO.
Copyright notice: when reprinting, please indicate the source of the article. Thank you for your cooperation.