A detailed guide to using the website robots file

Source: Internet
Author: User


Many webmasters pay no attention to the robots file, yet it matters a great deal in SEO optimization: you can use it not only to screen out dead links, but also to guide spiders to crawl the site map. You can refer to my http://www.seo0668.com/robots.txt, but do not simply copy it line for line.

User-agent: *
Disallow: /plus/ad_js.php
Disallow: /plus/advancedsearch.php
Disallow: /plus/car.php
Disallow: /plus/carbuyaction.php
Disallow: /plus/shops_buyaction.php
Disallow: /plus/erraddsave.php
Disallow: /plus/posttocar.php
Disallow: /plus/disdls.php
Disallow: /plus/feedback_js.php
Disallow: /plus/mytag_js.php
Disallow: /plus/rss.php
Disallow: /plus/search.php
Disallow: /plus/recommend.php
Disallow: /plus/stow.php
Disallow: /plus/count.php
Disallow: /include
Disallow: /templets
Disallow: /wl/
Disallow: /plus/
Disallow: /plus/guestbook
Disallow: /plus/guestbook.php
Disallow: /mmseo/75.html
Disallow: /plus/arcmulti.php?mtype=0&pnum=
Disallow: /plus/arcmulti.php?mtype=1&pnum=
Disallow: /plus/feedback_ajax.php
Disallow: /mmseo/xyseo/177.html
Disallow: /MMSEO/33.html
Sitemap: http://www.seo0668.com/sitemap.htm
Sitemap: http://www.seo0668.com/sitemap.xml

How the robots file works:

To fetch pages, a search engine sends out spiders (also called robots) to crawl our website. Before crawling a site, a spider first visits the robots.txt file in the site's root directory. The contents of robots.txt tell the spider which parts of the site may be crawled and which may not, so robots.txt is an SEO detail that should not be neglected. It guides spiders to crawl our site correctly, and we can use it to block important sections or folders so spiders cannot reach them, which helps protect the site's security.
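The crawl check described above can be reproduced locally. Below is a minimal sketch using Python's standard `urllib.robotparser`; the robots.txt content and the example.com URLs are hypothetical, chosen to resemble the DedeCMS directories blocked earlier, not fetched from any live site:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, parsed from a list of lines rather than
# fetched over HTTP, so the example runs offline.
rules = """\
User-agent: *
Disallow: /include/
Disallow: /templets/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Spiders may fetch the homepage, but not the blocked template directory.
print(parser.can_fetch("*", "http://example.com/index.html"))      # True
print(parser.can_fetch("*", "http://example.com/templets/a.htm"))  # False
```

This is the same decision a well-behaved spider makes before requesting each URL: it matches the path against the `Disallow` prefixes for its user agent.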

The role of the robots file:

1. When you delete a directory, use the robots file to forbid spiders from accessing it, so that dead links do not form. Search engines may also index useless parts of your site, such as image directories, JS directories, and so on; if spiders read useless content on your site, it can cost you a little weight, so block those as well.

2. The robots syntax looks simple but is in fact quite strict. I suggest looking up the relevant documentation online, because even a misplaced space can cause unintended consequences.

3. The biggest use of robots is to submit the site map. Submitting a site map lets search engines know that you update every day, and your articles will be included more fully and more quickly.
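The `Sitemap` directives at the end of the robots.txt shown earlier are what carry this out. As a sketch, Python 3.8+ exposes them through `RobotFileParser.site_maps()`; the file below is a made-up example combining crawl rules with sitemap submission:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: crawl rules plus two Sitemap directives,
# mirroring the .htm/.xml pair used on the site referenced above.
rules = """\
User-agent: *
Disallow: /plus/
Sitemap: http://example.com/sitemap.xml
Sitemap: http://example.com/sitemap.htm
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# site_maps() returns the declared sitemap URLs in file order.
print(parser.site_maps())
# ['http://example.com/sitemap.xml', 'http://example.com/sitemap.htm']
```

Spiders read these URLs and fetch the sitemaps directly, which is why new articles get discovered faster than by crawling alone.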

Writing the robots file:

1. Allow search engines to include all of the site's content: simply leave robots.txt empty and write nothing.

2. Forbid search engines from including certain pages of the site; see http://www.seo0668.com/robots.txt for reference:

User-agent: *

Disallow: /directory-name-1/

Disallow: /directory-name-2/

3. Forbid search engines from including any content on the site:

We know that Taobao blocks Baidu. I found Taobao's robots.txt by entering http://www.taobao.com/robots.txt in a browser; you can copy its approach.

Google Spider: Googlebot
Baidu Spider: Baiduspider
Yahoo Spider: Slurp

User-agent: Baiduspider
Disallow: /
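The effect of these Taobao-style rules can be checked the same way. A sketch with a made-up domain, showing that Baiduspider is shut out of the entire site while other spiders remain unaffected:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking only Baidu's spider, as Taobao did.
rules = """\
User-agent: Baiduspider
Disallow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Baiduspider is denied everything; with no rules naming Googlebot
# (and no "User-agent: *" group), other spiders default to allowed.
print(parser.can_fetch("Baiduspider", "http://example.com/"))  # False
print(parser.can_fetch("Googlebot", "http://example.com/"))    # True
```

This shows why the `User-agent` line matters: `Disallow: /` only applies to the spiders whose group it appears in.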
