Webmasters are no strangers to the robots.txt file; we generally use it to stop search engine spiders from crawling pages we do not want them to crawl. In fact, however, this file can also be used to make search engine spiders crawl our site more diligently. Below the author gives a brief introduction to how this is done.
First: Understanding the robots.txt File
Webmasters should be very clear about the role of this file: it tells the search engine spider which files on the site may be viewed and crawled and which content is off limits. Once the spider has read it, it can put all of its attention on the pages we do allow it to crawl, so that the site's limited weight is concentrated on them.
At the same time, we must not overlook a key point: the robots.txt file is the first file every search engine looks at when it visits our site. Because of this, I believe we can use the robots.txt file to make search engine spiders crawl our site more diligently.
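To make this concrete, here is a minimal robots.txt sketch of the usual blocking role described above; the paths are only placeholders chosen for illustration, not part of the original article:

User-agent: *
Disallow: /admin/
Disallow: /search/
Disallow: /temp.html

Every spider that obeys the file will skip the listed paths and spend its crawl budget on the rest of the site.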
Second: Write the Sitemap Address into robots.txt
As I said above, when a search engine spider enters our site it first visits the robots.txt file. Based on this, if we write the address of our sitemap file (sitemap.xml) into robots.txt, our sitemap can naturally be found better and faster by the search engine spiders. In the author's experience, the effect of this method is very obvious for small sites, and it also has a clear positive effect on large ones.
The specific method can be seen in the example below.
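As a concrete illustration, this is a minimal robots.txt sketch with the sitemap address written in, assuming the sitemap has already been generated as sitemap.xml and uploaded to the site root; the domain www.example.com is only a placeholder:

User-agent: *
Disallow: /admin/
Sitemap: http://www.example.com/sitemap.xml

The Sitemap line must use an absolute URL, and the major search engines read it no matter where it appears in the file, so placing it at the end keeps the crawl rules easy to read.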
From this I believe that whether your site is a new station or an old one, a large site or a small or medium-sized one, writing the sitemap file into robots.txt will give very good results.
Third: Obtaining the Sitemap File
Having said so much, perhaps some beginners still do not know how to obtain a sitemap file, so I will share a small tool for generating one.
It is a very small tool, and it is also a sitemap generation tool recommended by Google. Its address is http://www.xml-sitemaps.com/. When we open the tool, we can generate a sitemap for our site according to the options it provides. Once the sitemap has been generated, we simply place the file in the root directory of the site.
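For reference, the generated sitemap.xml follows the sitemaps.org protocol and looks roughly like the sketch below; the URLs, dates, and priorities are only placeholders, not output from the tool itself:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>

Only the loc element is required for each URL; the other tags are optional hints that spiders may or may not use.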
To sum up, I believe that writing the sitemap file into robots.txt is a feasible way to improve how search engines crawl our site, and the operation is not difficult at all. I hope this article helps everyone improve how thoroughly their pages are crawled. This article was originally written by mobile QQ download http://kukud.net/webmaster; please keep this address when reprinting. Thank you.