Website optimization: moderately blocking the "spider" is beneficial and harmless

Source: Internet
Author: User


The goal of website optimization is to get search engines to index pages quickly, thereby increasing the site's weight and traffic. Webmasters therefore like to see spiders crawl all over the site and "eat" it thoroughly. But is letting the spider crawl without restriction really good for optimization? In robots.txt, many webmasters only restrict the spider from the admin and data directories and leave every other directory open for indexing. In my view, beyond those security directories, appropriately blocking a few more directories from search engine indexing is beneficial and harmless.

One: Blocking the image directory
Images are a main building block of a site, but the images bundled with a theme template already appear all over the search engine; what does indexing them over and over add? Even if they are included, the effect is minimal. On a good site, the various spiders together can make thousands of visits a day, so letting them crawl images is undoubtedly a great waste of bandwidth. The directory blocked for this reason is usually named "images".
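Assuming the image directory sits at the site root and is named "images" (as in the text), a minimal robots.txt rule to block it could look like this:

```text
User-agent: *
Disallow: /images/
```

The trailing slash limits the rule to the directory itself rather than every path that merely starts with "/images".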

Two: Blocking the cache directory
Many programs have a cache directory. Take z-blog as an example: its cache directory is "cache", and it holds a large number of generated HTML cache files. If the spider keeps indexing them, it ends up indexing duplicate content, which is harmful to the site. Many z-blog users have not set this up, probably because they have not paid enough attention to it. Other programs have their own, differently named cache directories, and blocking each one specifically is the most appropriate approach.
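For the z-blog case described above, with its cache directory named "cache", the corresponding rule would be:

```text
User-agent: *
Disallow: /cache/
```

For other programs, substitute whatever directory name that program actually uses.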

Three: Blocking the template directory
Most webmasters apply a ready-made program template directly rather than building an independent one. The high repetition rate of these templates becomes redundant information in the search engine, so using robots.txt to block the template directory is beneficial. Moreover, template files are often highly similar to the generated pages, which easily produces duplicate content.
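Assuming the template files live in a root directory named "template" (the actual name varies by program, so check your installation), the rule is analogous:

```text
User-agent: *
Disallow: /template/
```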

Four: Blocking the CSS directory
The CSS directory is useless to the search engine's crawl, so we block it in the robots.txt file to improve the quality of the search engine's index. Providing a concise, straightforward indexing environment for search engines makes it easier to improve the site's friendliness.
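Assuming the stylesheets are kept in a directory named "css" (an illustrative name), the rule follows the same pattern:

```text
User-agent: *
Disallow: /css/
```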

Five: Blocking some programs' RSS pages
This point applies only to programs that generate RSS pages, which are most common among blogs. An RSS page is a highly repetitive page, and blocking it is well worth doing to improve the site's friendliness in the search engine.
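The feed location differs by program; assuming hypothetical paths such as a "/feed/" directory or an "rss.xml" file at the root, the rules might read:

```text
User-agent: *
Disallow: /feed/
Disallow: /rss.xml
```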

Six: Blocking duplicate versions of the same page
On a site that generates static pages, the same content is generally also reachable through a dynamic URL, for example "www.xxxx/1.html" and "www.xxxx/asp?id=1". If the search engine fully indexes both, it ends up with two copies of identical content, which is harmful to the site's search engine friendliness. We usually block the latter, because the former is a static page and is more easily indexed by search engines.
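One way to block the dynamic duplicates is a wildcard rule matching any URL containing a query string. Note that the "*" wildcard is an extension to the original robots.txt convention; it is supported by the major search engines, but not by every crawler:

```text
User-agent: *
Disallow: /*?
```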

Seven: Blocking the program's security pages
At the beginning of this article we explained blocking the admin and data directories to protect the security directories and the database from leaking. In addition, the database directory, the site log directory, and the backup directory all need to be blocked, which can effectively reduce the "leakage" phenomenon.
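Collecting the directories named in this section into one rule set might look like the following; the directory names are illustrative and should be replaced with the ones your program actually uses:

```text
User-agent: *
Disallow: /admin/
Disallow: /data/
Disallow: /logs/
Disallow: /backup/
```

Keep in mind that robots.txt is itself publicly readable, so it keeps these paths out of the index but does not secure them; server-side access control is still needed.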

Eight: Blocking archive files
Some webmasters like to back up the site and then forget to delete the archive after downloading it, or never download it at all and leave it on the server. But as everyone knows, a growing site attracts prying eyes: its database and backup files are probed again and again, and any site with a bit of ranking is subject to this kind of attack. Using robots.txt to block file types such as "rar" and "zip" helps keep them out of the index. Strictly speaking, this measure mainly reinforces point seven, and it is still only a partial defense.
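Matching by file extension relies on the "*" and "$" wildcard extensions, which the major search engines support ("$" anchors the end of the URL):

```text
User-agent: *
Disallow: /*.rar$
Disallow: /*.zip$
```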

Summary: Moderately blocking the spider not only saves our server resources but also increases the search engine's friendliness toward the site, so why not do it? Binary Network, together with the professional website construction company Pilotage Technology (www.joyweb.net.cn), believes that the robots.txt file is not merely a tool for blocking security directories; optimized well, it further facilitates the site's search engine optimization.
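As a quick sanity check on rules like the ones above, Python's standard-library `urllib.robotparser` can confirm that prefix-style Disallow lines behave as intended (note: this stdlib parser handles plain path prefixes only, not the "*"/"$" wildcard extensions; the domain and paths below are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt using prefix-style Disallow rules.
rules = """\
User-agent: *
Disallow: /images/
Disallow: /cache/
Disallow: /admin/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Ordinary content pages remain crawlable...
print(rp.can_fetch("*", "http://www.example.com/1.html"))        # True
# ...while pages under blocked directories are not.
print(rp.can_fetch("*", "http://www.example.com/cache/a.html"))  # False
```

Running a check like this before deploying a robots.txt change is a cheap way to avoid accidentally blocking content pages.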
