Robots.txt is a very simple text file that tells search engines which pages may be indexed and which may not.
1. How to view your own website's robots.txt: first, use FTP to look in your site's root directory; if the file is there, download it and open it locally. Second, append /robots.txt to your site's domain name, e.g. www.joy0574.com/robots.txt. If that URL does not open, the file has not been placed correctly.
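The second check above can be sketched with Python's standard library; `robots_url` is a hypothetical helper name, not something from the article:

```python
from urllib.parse import urlsplit, urlunsplit

def robots_url(site):
    """Build the root-level robots.txt URL for a site, as described
    in the text: the file must live at the root of the domain."""
    parts = urlsplit(site)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_url("http://www.joy0574.com"))
# http://www.joy0574.com/robots.txt
```

Opening the printed URL in a browser is the quickest way to confirm the file is in the right place.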
2. Main points: in general, a robots.txt uses only two directives: User-agent and Disallow. "User-agent: *" (with an asterisk) applies the rules to all search engines. Each path you want to block needs its own Disallow line, one rule per line, and there must be at least one Disallow line. To allow everything to be indexed, write "Disallow:" (empty); to block everything, write "Disallow: /". (Note: the only difference is a single slash.)
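A minimal sketch using Python's standard `urllib.robotparser` to demonstrate the one-slash difference described above: an empty "Disallow:" permits everything, while "Disallow: /" blocks the whole site. The example.com URL is a placeholder, not from the article:

```python
from urllib.robotparser import RobotFileParser

# "Disallow:" with no path blocks nothing.
allow_all = RobotFileParser()
allow_all.parse(["User-agent: *", "Disallow:"])

# "Disallow: /" blocks the entire site.
block_all = RobotFileParser()
block_all.parse(["User-agent: *", "Disallow: /"])

print(allow_all.can_fetch("*", "http://example.com/page.html"))  # True
print(block_all.can_fetch("*", "http://example.com/page.html"))  # False
```

Running your draft rules through a parser like this before uploading is a cheap way to avoid accidentally blocking your whole site.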
3. The file must be named robots.txt, all in lowercase, or it will be ignored and your effort wasted.
4. In addition, if your site allows all search engines to index everything, you can simply omit this file or leave robots.txt empty.
Reading this, many people may say: then it's best not to create one at all, so why bother? You must understand that getting robots.txt right is the first step of SEO; if it is wrong, it is extremely harmful to getting your site indexed. So before you submit your site to a search engine, check this file first. It only takes a few minutes, so don't shy away from the trouble.
Finally, one of robots.txt's biggest uses: unlike Google, whose webmaster tools let you submit Google Sitemaps directly, Baidu has no such tool, but you can put your sitemap address anywhere in robots.txt. That indirectly submits your sitemap to Baidu, so why not do it?
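This is done with the Sitemap directive, which may appear on any line of the file. A sketch, assuming the sitemap lives at /sitemap.xml (the path is an assumption for illustration, not stated in the article):

```
User-agent: *
Disallow:

# The sitemap URL must be absolute; crawlers pick it up wherever it appears.
Sitemap: http://www.joy0574.com/sitemap.xml
```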
The author of this article: www.joy0574.com