Z-blog is a small but powerful blog program based on the ASP platform. Its features include:
Support for switching interface themes and styles
Web-standards-compliant design
Static generation of posts, with support for custom directory configuration
Support for Firefox, Opera, Safari, and other browsers
Support for WAP and offline writing software
My personal site is built with Z-blog, and recently I have found that many Z-blog webmasters do not know how to set up their own robots.txt. In fact, this setting matters quite a bit: create a robots.txt file to keep search spiders away from certain directories and files on your site. The fewer unnecessary directories and files a spider crawls, the better, because not every file needs to be crawled by search engines.
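The syntax itself is simple: a User-agent line names which spider the rules apply to, and each Disallow line gives a path that spider should stay out of. A minimal sketch (the /example/ paths here are placeholders, not real Z-blog paths):
# Apply the rules below to every spider
User-agent: *
# Block an entire directory (placeholder path)
Disallow: /example/
# Block a single file (placeholder path)
Disallow: /example.asp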
There are plenty of reasons why a proper robots.txt setup matters:
Some files contain no keywords or text at all, such as this site's cmd.asp page.
In general, it is better to block the site's search results page, such as this site's sreach.asp.
The site admin pages should not turn up in users' search results and attract unnecessary trouble, so they also need to be blocked.
After a site redesign or URL rewrite, the old links that are no longer search-engine friendly need to be blocked with robots.txt.
Pages that are too similar to one another fall foul of the search engines' duplicate-content rules and can likewise be blocked, as sketched below.
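As a rough sketch of how those reasons turn into rules, the search results page and the admin directory each need only one Disallow line; the file name sreach.asp is simply the one cited in the list above, so adjust it to your own install:
User-agent: *
# The search results page mentioned above
Disallow: /sreach.asp
# The admin pages mentioned above
Disallow: /admin/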
Below are the robots.txt rules from my personal site; I hope they prove useful:
# robots.txt for Dikeyao
# Version 4.0.0
User-agent: *
# Directories
Disallow: /function/
Disallow: /cache/
Disallow: /xml-rpc/
Disallow: /script/
Disallow: /admin/
Disallow: /css/
Disallow: /language/
Disallow: /data/
Disallow: /themes/
Disallow: /include/
# Files
Disallow: /wap.asp
Disallow: /cmd.asp
Disallow: /c_option.asp
Disallow: /c_custom.asp
# Sitemap
Sitemap: http://www.dikeyao.cn/sitemap.xml