How do I create a robots.txt file?
You can create this file in any text editor. It should be an ASCII-encoded plain-text file, not an HTML file, and its name must be entirely lowercase.
The simplest robots.txt files use two rules:
(1) User-agent: the bot to which the following rules apply
(2) Disallow: the pages you want to block
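For example, a minimal entry combining these two rules (the /cgi-bin/ directory here is a hypothetical example, not from the original text) would be:

User-agent: *
Disallow: /cgi-bin/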
These two lines together are considered one entry in the file. You can include as many entries as you want, and a single entry can contain multiple Disallow lines and multiple user-agents.
What should be listed in the User-agent line? The user-agent is a specific search engine bot. The Web Robots Database lists many common bots. You can set an entry to apply to a particular bot (by listing its name) or to all bots (by listing an asterisk). An entry that applies to all bots looks like this:
User-agent: *
Google uses a variety of different bots (user-agents). The bot for web search is Googlebot. Other bots, such as Googlebot-Mobile and Googlebot-Image, follow the rules you set for Googlebot, and you can also set additional rules for those specific bots.
What should be listed in the Disallow line? The Disallow line lists the pages you want to block. You can list a specific URL or a URL pattern. Each entry should begin with a forward slash (/).
To block the entire site, use a forward slash:
Disallow: /
To block a directory and all of its contents, add a forward slash after the directory name:
Disallow: /private_directory/
To block a specific page, list that page:
Disallow: /private_file.html
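Putting these pieces together, a complete robots.txt file with two entries might look like this (the paths and the Googlebot-Image entry are illustrative assumptions, not taken from any real site):

User-agent: *
Disallow: /private_directory/
Disallow: /private_file.html

User-agent: Googlebot-Image
Disallow: /images/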
URLs are case-sensitive. For example, Disallow: /index.htm blocks http://www.gz-kongtiao.cn/index.htm, but does not block http://www.gz-kongtiao.cn/Index.htm.
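You can verify this case-sensitive behavior with Python's standard urllib.robotparser module. This sketch parses the two-line file in memory rather than fetching it; the domain is the example used above:

```python
from urllib.robotparser import RobotFileParser

# Parse a two-line robots.txt directly, without fetching it over HTTP.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /index.htm",
])

# Path matching is case-sensitive: only the exact lowercase path is blocked.
print(rp.can_fetch("*", "http://www.gz-kongtiao.cn/index.htm"))  # False (blocked)
print(rp.can_fetch("*", "http://www.gz-kongtiao.cn/Index.htm"))  # True (allowed)
```

can_fetch() returns False only for URLs whose path matches a Disallow rule exactly, character for character.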
The above is a summary of the relevant information I have gathered; I hope it is useful!