The role of robots.txt and common problems

Source: Internet
Author: User
Keywords: SEO


First, let's describe what robots.txt does and what it can be used for. Robots.txt is a protocol between a website and search engines, used to keep search engines from crawling content that we don't want indexed. We can use it to protect private content and to block dead links, empty pages, and duplicate pages.

The protocol is honored by all major search engines. It is not a command but an instruction, so it does not take effect immediately: it can take effect within a few days, or take 4 weeks or longer. Google generally applies it faster than Baidu. The common syntax is:

User-agent: defines which search engine the rules apply to; to address all search engines, use *.
Disallow: forbids the search engine from crawling a path; "/" stands for the root directory, i.e. all directories of the site.
Allow: explicitly permits crawling of a path.

This is only a brief description; for the specific rules and how to write them, please refer to the Baidu Library. Now let's talk about common problems and usage.
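As a quick sketch of how these directives interact (the rules and paths below are hypothetical, not from the article), a robots.txt file can be checked with Python's standard urllib.robotparser module:

```python
from urllib import robotparser

# A minimal robots.txt: block /private/ for all crawlers,
# but explicitly allow one page inside it.
# (Python's parser applies the first matching rule, so the
#  narrower Allow line is placed before the broader Disallow.)
rules = """\
User-agent: *
Allow: /private/public.html
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "http://example.com/"))                     # True
print(rp.can_fetch("*", "http://example.com/private/secret.html"))  # False
print(rp.can_fetch("*", "http://example.com/private/public.html"))  # True
```

This also illustrates why rule order can matter to some parsers, even though major search engines such as Google use longest-match precedence instead.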

1. Blocking dead links

Dead links are a problem no site can avoid entirely. If search engines have already indexed such pages, the dead links will inevitably hurt the user experience, and those pages should be blocked in robots.txt. Several common causes are:

1. Site errors: the site operator mistakenly changed or deleted certain pages.

2. Program errors: a site redesign or a change of program left dead links behind.

3. External links: links typed incorrectly on other sites point to pages that do not exist on your site.
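For example, if a redesign removed an old directory, a rule like the following (the path /old-shop/ is a hypothetical placeholder) keeps crawlers away from the resulting dead links:

```
# Hypothetical example: a directory deleted during a redesign
User-agent: *
Disallow: /old-shop/
```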

2. Blocking empty pages and duplicate pages

1. Empty pages and duplicate pages seriously lower the overall quality of a site's pages; a large number of such meaningless, repetitive pages can cause the site to be penalized. Examples worth blocking include registration pages, login pages, URLs carrying session IDs, loading pages, and the shopping-cart pages of e-commerce sites.
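As a hedged sketch (all file names and the sid parameter below are hypothetical; the * wildcard in Disallow is an extension supported by major engines such as Baidu and Google, not part of the original standard), such pages could be blocked like this:

```
User-agent: *
Disallow: /register.php
Disallow: /login.php
Disallow: /cart/
# Block any URL carrying a session ID parameter
Disallow: /*?sid=
```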

2. Blocking multiple paths to the same page. Many sites' homepages have this problem. For example, the Dangyang hotline site www.***.net is the address people normally use, but because the program and paths were not unified, the homepage was indexed under two paths: www.***.net and www.***.net/index.php. In a case like this, we should give the site a single standard entrance and block the duplicate path directly with robots.txt.
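A minimal sketch of that fix, again verified with Python's urllib.robotparser (example.com stands in for the real site):

```python
from urllib import robotparser

# Keep "/" as the single canonical entrance and block the
# duplicate "/index.php" path from being crawled.
rules = """\
User-agent: *
Disallow: /index.php
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "http://example.com/"))           # True  - canonical entrance
print(rp.can_fetch("*", "http://example.com/index.php"))  # False - duplicate path blocked
```

In practice this is usually combined with a 301 redirect or a canonical tag, but the robots.txt rule alone stops the duplicate path from being crawled.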

SEO is a process of accumulation. I hope the points above help novice webmaster friends optimize their sites. This article does not describe the detailed syntax of robots.txt; for the specific rules, see the Baidu robots optimization guide. Here we have only described its role and usage, in the hope of filling some gaps. Thanks to the webmaster of Www.444100.net for contributing this article!
