I have been emphasizing the details of on-site optimization. What Baidu now requires of a site is that the details are handled well, and code, tags, and so on all have their own details. The robots.txt file is also part of those site details, and getting it right has a great ...
Robots.txt Guide: When a search engine visits a website, it first checks whether a plain-text file named robots.txt exists in the root directory of the site. The robots.txt file is used to limit the search engine's access to the website, ...
About the syntax and function of robots.txt
As we know, every search engine has its own "search robots" (spiders), which follow links on web pages (generally HTTP and src links) across the network and continually crawl data to build the engine's own database.
Most of you have probably heard of robots.txt files, and some may have written one yourselves. In fact, I have not written a robots.txt file for this blog so far; it is not that I refuse to write one, I just feel there is nothing on the blog that needs to be kept away from spiders. And presumably everyone ...
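As a point of reference, a minimal robots.txt sketch might look like the following; the directory names and the sitemap URL are placeholders for illustration, not taken from the original article.

# Apply the rules below to all crawlers
User-agent: *
# Keep crawlers out of two example directories
Disallow: /admin/
Disallow: /tmp/
# Optionally point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml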
Article directory
What do you want to do?
Use the robots.txt file to block or remove web pages
The robots.txt file restricts access to your website by the search engine robots that crawl the web. These bots are automated.
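For instance, to block a particular crawler from a hypothetical page or directory (the user-agent names are real spider names, but the paths are invented for illustration):

# Keep Baidu's spider away from one old page
User-agent: Baiduspider
Disallow: /private/old-page.html

# Keep Google's crawler out of printer-friendly duplicates
User-agent: Googlebot
Disallow: /print/

Note that Disallow only prevents crawling; a page that is already indexed may still need to be removed through the search engine's own removal tools.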
1. Display the currently mounted file systems in a friendlier format
mount | column -t
This command works with any file system; column is used to format the output as a table. The main purpose here is to familiarize you with the usage of ...
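To give a feel for the effect, the aligned output looks roughly like this; the devices, mount points, and options below are purely illustrative and will differ on every system:

$ mount | column -t
/dev/sda1  on  /      type  ext4   (rw,relatime)
proc       on  /proc  type  proc   (rw,nosuid,nodev,noexec)
tmpfs      on  /run   type  tmpfs  (rw,nosuid,nodev)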
Table of contents for this series
If different links point to pages with largely identical content, this is called "duplicate content". If a website contains a large amount of duplicate content, search engines will assume that ...
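One common way to reduce such duplication, offered here only as an illustrative assumption rather than something from the original article, is to keep crawlers away from parameterized URLs that repeat an existing page. The * wildcard in Disallow is supported by major crawlers such as Googlebot and Baiduspider, though it is not part of the original robots.txt standard:

User-agent: *
# Block sorted or session-tagged views that duplicate the canonical listing page
Disallow: /*?sort=
Disallow: /*?sessionid=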
Configuring Apache logs not to record requests for image files
# Mark requests for image files, then exclude them from the access log
SetEnvIf Request_URI "\.(gif|jpg|jpeg|png)$" imag
CustomLog logs/access_log combined env=!imag
Apache logs: if every access request is recorded, the log file becomes very large, and if you use log-analysis software, the analysis results may not be ...
Apache HTTP Server log files: the following is a detailed description. To manage a web server effectively, it is necessary to get feedback on the server's activity, performance, and ...
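As a hedged sketch of how that feedback is usually captured, the "combined" format referenced above is typically defined and applied in httpd.conf roughly as follows; the log file paths are only examples:

# Define the standard "combined" access-log format
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined
# Write access and error information to separate files
CustomLog logs/access_log combined
ErrorLog logs/error_log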