Thoughts on robots.txt settings for DedeCMS (Zhimeng) sites

Source: Internet
Author: User
Keywords: DedeCMS


Webmasters who build sites with DedeCMS will notice that the robots.txt file it ships with is very basic (the developers have to allow for each site's different directory structure and optimization goals), so it cannot fully meet a specific site's SEO requirements. How, then, do you tailor robots.txt to your own site? Here are some ideas for reference.

The settings below apply to DedeCMS sites that do not use pseudo-static URLs.

User-agent: *
Disallow: /dede       # admin back-end directory; rename it (details below)
Disallow: /include    # program core files
Disallow: /member     # member-management directory; some files can be left open
Disallow: /plus       # plug-in and helper directory; some files, such as search.php, can be left open
Disallow: /templets   # default template directory
Disallow: /data       # system cache and other writable data
Disallow: /uploads    # uploaded files; block this if you do not want your images indexed
Disallow: /images     # default template images
Disallow: /index.php  # the dynamic home page; once the site is made static, best blocked
Disallow: /404.html
Allow: /plus/search.php  # re-opens a specific file inside a blocked directory

...
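The prefix rules above can be sanity-checked locally with Python's standard-library robots.txt parser. Note that `urllib.robotparser` implements the original exclusion standard and treats `*` inside a path literally, so the wildcard rules discussed later must be tested in the search engines' own webmaster tools instead. The rules and paths here are just the example file above; the domain is a placeholder.

```python
# Sanity-check the prefix rules with Python's standard-library parser.
from urllib.robotparser import RobotFileParser

RULES = """\
User-agent: *
Allow: /plus/search.php
Disallow: /include
Disallow: /member
Disallow: /plus
Disallow: /templets
Disallow: /data
Disallow: /index.php
"""

parser = RobotFileParser()
parser.parse(RULES.splitlines())

def allowed(path: str) -> bool:
    """True if a generic crawler may fetch the given path."""
    return parser.can_fetch("*", "http://www.example.com" + path)

print(allowed("/include/common.inc.php"))  # core file: blocked
print(allowed("/plus/search.php"))         # explicitly re-opened
print(allowed("/plus/other.php"))          # rest of /plus: blocked
print(allowed("/a/article.html"))          # ordinary content: allowed
```

One detail: the standard-library parser applies the first matching rule, which is why the `Allow` line is listed before the broader `Disallow: /plus` here; major engines instead use longest-match precedence as specified in the Robots Exclusion Protocol, so either ordering works for them.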

That covers the basic settings. The following focuses on the back-end management directory and the column (list) pages:

1. The dede directory should be renamed for site security. But after renaming it, a question arises: how do you reference it in robots.txt? Disallowing the new name directly would leak the back-end directory, which defeats the purpose of renaming it. We can solve this with a wildcard rule. For example, if the back-end directory is renamed to dedecms:

Disallow: /d*ms

This blocks search engines from crawling the directory without disclosing its full name.

2. Column (list) pages. If the site does not use pseudo-static URLs, each column page is reachable through two links, e.g. */web/ and */web/list_1_1.html. For SEO it is recommended to optimize the column pages (specific methods can be found online) so that the first-page link takes the */web/ form, and then add the following rule to robots.txt:

Disallow: /*1.html$
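Python's standard parser ignores `*` and `$`, so to reason about the two wildcard rules above (`/d*ms` and `/*1.html$`) here is a minimal sketch of the wildcard matching described in the Robots Exclusion Protocol. The `rule_matches` helper is hypothetical, not part of any library: `*` matches any run of characters, a trailing `$` anchors the match at the end of the path, and otherwise a rule matches any path that begins with the pattern.

```python
# Sketch of robots.txt wildcard matching ('*' and '$'), per the
# Robots Exclusion Protocol; not any engine's exact implementation.
import re

def rule_matches(rule: str, path: str) -> bool:
    """True if a wildcard Disallow rule matches the URL path."""
    anchored = rule.endswith("$")
    if anchored:
        rule = rule[:-1]
    # Escape regex metacharacters, then turn the escaped '*' back into '.*'.
    pattern = "^" + re.escape(rule).replace(r"\*", ".*")
    if anchored:
        pattern += "$"
    return re.match(pattern, path) is not None

print(rule_matches("/d*ms", "/dedecms/login.php"))      # renamed admin dir: blocked
print(rule_matches("/d*ms", "/a/article.html"))         # ordinary content: unaffected
print(rule_matches("/*1.html$", "/web/list_1_1.html"))  # duplicate list URL: blocked
print(rule_matches("/*1.html$", "/web/"))               # column home page: kept
```

One caveat worth checking against your permalink scheme: `/*1.html$` blocks every URL that ends in 1.html, so if article pages can end that way (e.g. a hypothetical /arc/201.html), the rule needs to be narrowed.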

That is the gist of setting up robots.txt for DedeCMS; adjust the details to suit your own site.

Note:

1. Set directory permissions according to the official documentation;

2. After renaming the back-end directory, make sure its first and last letters differ from those of every other directory (otherwise the wildcard rule above would block those directories as well);

3. Use Baidu Webmaster Tools to verify that the robots.txt rules behave as intended.

This article was edited by http://www.1886sj.com.
