How to Write robots.txt

At the moment, many website administrators seem to pay little attention to robots.txt, but it has functions that should not be overlooked. So today, Shijiazhuang Seo would like to use this article to talk briefly about how to write robots.txt.

robots.txt is a plain text file. In it, the website administrator can declare the parts of the site that should not be visited by robots, or specify that search engines should only index certain content. When a search robot (also called a search spider) visits a site, it first checks whether the site's root directory contains robots.txt. If the file exists, the robot determines its crawling scope from the file's contents; if it does not exist, the robot simply follows the links it finds. Note that robots.txt must be placed in the root directory of the site, and the file name must be all lowercase.

Example: the robots.txt file from http://www.shijiazhuangseo.com.cn

    # All robots will spider the domain
    User-agent: *
    Disallow:

The text above allows all search robots to access every file on www.shijiazhuangseo.com.cn.

Syntax notes: text after "#" is a comment; User-agent names the search robot, and "*" stands for all search robots; Disallow lists the directories or files that must not be accessed.

Next, here are the common ways robots.txt is used.

Allow all robots to access the whole site:

    User-agent: *
    Disallow:

Alternatively, you can simply create an empty "/robots.txt" file.

Prohibit all search engines from accessing any part of the site:

    User-agent: *
    Disallow: /

Prohibit all search engines from accessing several parts of the site (the 01, 02, and 03 directories in this example):

    User-agent: *
    Disallow: /01/
    Disallow: /02/
    Disallow: /03/

Prohibit a particular search engine (BadBot in this example):

    User-agent: BadBot
    Disallow: /

Allow only one search engine (Crawler in this example):

    User-agent: Crawler
    Disallow:

    User-agent: *
    Disallow: /

In addition, I think it is worth expanding the description to introduce the robots meta tag. The robots meta tag works page by page. Like other meta tags (such as the language used, the page description, and keywords), the robots meta tag is placed in the head section of the HTML page.
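As a minimal sketch of what that looks like (the page and the directive values here are only illustrative, not taken from the original article), a page that should be neither indexed nor followed could carry:

    <head>
      <title>Example page</title>
      <!-- tell robots not to index this page and not to follow its links -->
      <meta name="robots" content="noindex, nofollow">
    </head>

The content attribute accepts comma-separated combinations of index/noindex and follow/nofollow; "all" and "none" are common shorthands for the two extremes.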
