You can create a robots.txt file in the website's root directory to guide search engines in crawling and indexing the site.
Google spider: Googlebot
Baidu spider: Baiduspider
MSN spider: MSNBot
robots.txt writing syntax
Allow access by all robots
User-agent: *
Disallow:
Or
User-agent: *
Allow: /
Alternatively, you can simply create an empty "/robots.txt" file.
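These rules can be checked locally with Python's standard urllib.robotparser module. The following is a minimal sketch, assuming an illustrative URL on www.example.com:

from urllib.robotparser import RobotFileParser

# Parse the "allow all robots" rules shown above.
rules = ["User-agent: *", "Disallow:"]
rp = RobotFileParser()
rp.parse(rules)

# An empty Disallow value blocks nothing, so any URL may be fetched.
print(rp.can_fetch("Googlebot", "http://www.example.com/page.html"))  # True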
Prohibit all search engines from accessing any part of the website
User-agent: *
Disallow: /
Prohibit all search engines from accessing specific sections of the website (the 01, 02, and 03 directories in the following example)
User-agent: *
Disallow: /01/
Disallow: /02/
Disallow: /03/
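The directory rules can be verified in the same way; this sketch again uses urllib.robotparser and an illustrative host www.example.com:

from urllib.robotparser import RobotFileParser

# Parse the rules that block the /01/, /02/, and /03/ directories for all robots.
rules = ["User-agent: *", "Disallow: /01/", "Disallow: /02/", "Disallow: /03/"]
rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("Baiduspider", "http://www.example.com/01/index.html"))  # False, inside /01/
print(rp.can_fetch("Baiduspider", "http://www.example.com/news/"))          # True, not a listed directory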
Prohibit access by a specific search engine (BadBot in the following example)
User-agent: BadBot
Disallow: /
Allow access by only one search engine (Crawler in the following example)
User-agent: Crawler
Disallow:
User-agent: *
Disallow: /
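A quick check of the agent-specific rules above, again using urllib.robotparser with an illustrative host www.example.com: only the robot named Crawler matches the group with an empty Disallow, while every other robot falls through to the catch-all group and is blocked.

from urllib.robotparser import RobotFileParser

# Two groups: Crawler may fetch everything, all other robots are blocked.
rules = [
    "User-agent: Crawler",
    "Disallow:",
    "",
    "User-agent: *",
    "Disallow: /",
]
rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("Crawler", "http://www.example.com/"))    # True
print(rp.can_fetch("Googlebot", "http://www.example.com/"))  # False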