How to use the robots file to enhance the weight of main pages



The robots file is the first file a search engine reads when it visits a website; it tells the crawler which content may be crawled and which may not.

The crawl rules in the site's current robots file are already fairly complete: for example, images are blocked from being crawled, and the spider is not allowed to crawl member privacy (resume) pages, useless pages (old promotion pages), style sheet files, and so on. But there are still pages the spider does not need to crawl. These pages are intended only for users; crawling them is meaningless for search engines and only dilutes the pages' weight.
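As a rough sketch of what such rules might look like, the fragment below blocks these kinds of user-only resources. The paths (/images/, /resume/, /promotion/, /css/) are hypothetical examples, since the article does not give the site's actual directory names:

User-agent: *
Disallow: /images/
Disallow: /resume/
Disallow: /promotion/
Disallow: /css/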

The website's 404 pages

Every site has a number of error pages. The 404 page exists so that users who reach a broken URL can be guided to the correct page, preventing the loss of site traffic. A site with many pages inevitably has many error URLs, which leads to many similar 404 pages, such as http://www.daochengrc.com/404.html and http://www.yongjiangrc.com/404.html.

(Illustration: examples of these duplicate 404 pages.)

If there are too many 404 pages and the search engine indexes them, the site's weight will be spread across these 404 pages, so they should be blocked.

Add the rule: Disallow: /404.html
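For reference, a minimal robots.txt containing just this rule might look as follows; the User-agent: * line, which applies the rule to all crawlers, is an assumption and not part of the original instruction:

User-agent: *
Disallow: /404.html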

Some of the site's navigation pages

Some navigation pages at the bottom of the site, such as "Marketing Cooperation", "Website Statement", and "Payment Method", exist purely for customers. Few users reach the site by searching for these pages, yet because they are displayed site-wide and appear on every content page, they also draw away the weight of those pages.

These pages are all located in the same directory, /main. Apart from a few pages the spider still needs to crawl, the rest of the directory can be blocked. The pages to keep are "About Us" (main/aboutus.asp) and "Friendly Links" (main/friendlink.asp). In addition, the "Tariff Standard" and "Payment Method" pages are located in the corporate member center; these pages do not need to be open to search engines.

Add the rules:

Allow: /main/aboutus*
Allow: /main/friendlink*
Allow: /main/recruitmeeting*
Allow: /main/investigation*
Disallow: /main/
Disallow: /company/companyapplymember*

Delete the last entry from the existing rules: Disallow: /main/refuse*

Place the two newly added "Allow" rules in front of the Disallow directive.
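Putting the changes together, the relevant section of robots.txt would then read roughly as follows; this is a sketch that assumes the rules apply to all crawlers and that no other rules sit between them, and it includes the 404 rule added earlier:

User-agent: *
Allow: /main/aboutus*
Allow: /main/friendlink*
Allow: /main/recruitmeeting*
Allow: /main/investigation*
Disallow: /main/
Disallow: /company/companyapplymember*
Disallow: /404.html

For crawlers that evaluate rules in order, keeping the Allow lines above Disallow: /main/ ensures the retained pages are not accidentally blocked.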

By blocking pages that are meaningless for search engines to crawl, more weight is concentrated on the home page and on the key content pages.

Copyright: Woo Jun Recruitment Network, http://www.51rc.com. Please indicate the source when reproducing.
