Talking about SEO optimization: do not treat the spider as an emperor

Source: Internet
Author: User
Keywords: SEO, spiders, the emperor


Most webmasters know how important a good search engine ranking is, so many do their best to please the search engine, treating its spider like an emperor in the hope of winning its favor and lifting the site's ranking. In fact, even a well-served spider will not necessarily grant good rankings. Why? Because a spider has no human feelings; the moment you treat it as an emperor, it still shows you no mercy in return. In website optimization, therefore, being nicer to the spider does not always produce a better result: you must learn to choose, and learn the skill of blocking the spider from certain places. For example, besides restricting the spider from crawling the admin and data directories, it also pays to block it from several other directories. Below is an analysis of a few techniques for blocking spiders.
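As a concrete starting point, the admin and data directories mentioned above can be blocked with a robots.txt file in the site root. A minimal sketch follows; the directory names are assumptions, so substitute whatever your own site program actually uses:

```
# Keep all crawlers out of the backend and data directories
User-agent: *
Disallow: /admin/
Disallow: /data/
```

The file must sit at the root of the domain (e.g. /robots.txt) for crawlers to find it.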

One: The image and template directories can be blocked

Many webmasters today grab the same stock images from the Internet and apply ready-made templates, and these templates and images are already everywhere on the web. If you then let the spider crawl these stale things on your site as well, the spider will naturally be disgusted, and your site may be labeled as an imitation or even as cheating, making the search engine's favor even harder to win. So the images directory can usually be blocked.
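A minimal robots.txt fragment for this, assuming the stock images and template files live under /images/ and /templets/ (the exact directory names depend on your program and template, so check before copying):

```
User-agent: *
# Generic stock images and ready-made template files add no original value
Disallow: /images/
Disallow: /templets/
```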

Two: The cache directory can be blocked to prevent duplicate indexing

The spider is very greedy: whatever you feed it, genuine or not, it takes. The contents of the site's cache directory, for example, will also be indexed, which inevitably duplicates content the site already has. If there is too much duplication, Baidu's algorithm may judge your site to be cheating, which, far from raising your site's weight, does it great harm. The cache directory differs from one type of site to another, so it is necessary to block the corresponding cache directory according to the construction program used.
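For example, assuming the construction program writes its cache under a path such as /cache/ or (for a WordPress site with a caching plugin) /wp-content/cache/, the corresponding rules would look like the sketch below; the paths are assumptions, so verify where your own program actually keeps its cache:

```
User-agent: *
# Adjust these paths to wherever your program keeps its cached copies
Disallow: /cache/
Disallow: /wp-content/cache/
```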

Three: The CSS directory and some RSS pages need to be blocked

The CSS directory is completely useless to the spider, and crawling it can interfere with the search engine's algorithmic judgment, so it can be blocked through the robots.txt file. Likewise, many sites' RSS pages simply duplicate existing content, and crawling them can also mislead the search engine's judgment; both kinds of content need to be blocked. This sort of blocking may look like great disrespect to the spider, but like good medicine it is bitter yet beneficial!
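A sketch of the corresponding robots.txt rules, assuming the stylesheets sit in /css/ and the feed is served at /rss/ (both paths vary by site, so adjust to your own layout):

```
User-agent: *
# Stylesheets carry no indexable content
Disallow: /css/
# The feed only repeats content that already exists on the site
Disallow: /rss/
```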

Four: When the same page exists in two forms, block the dynamic one first

In general, a site's static pages are very easily indexed by search engines (though crawling and indexing are, as a rule, two different things). Besides static pages, most sites also have dynamic pages; for example, www.xxxx/1.html and www.xxxxx/asp?id=1 point to the same page. If they are not blocked, both will inevitably be crawled, yet when the search engine's algorithm finds two identical pages it will suspect your site of cheating and scrutinize it more closely, which hurts the site's ranking. The right approach is therefore to block the site's dynamic pages.
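Since dynamic URLs carry a "?" query string, a single wildcard rule can block them all at once. Note that the * wildcard is an extension honored by the major crawlers (including Baidu and Google) rather than part of the original robots.txt standard, so smaller crawlers may ignore it:

```
User-agent: *
# Block any URL containing a question mark (dynamic pages)
Disallow: /*?
```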

Five: Content involving the site's security and privacy should be blocked

As mentioned at the beginning of this article, the admin and data directories actually concern the site's security and privacy. Exposing them to the spider brings not the slightest benefit and may even open one more channel of attack. So directories touching on security, such as the database directory, the website log directory, and the backup directory, all need to be blocked. In addition, some webmasters back up the site and download the backup, but then forget to delete the backup file afterwards; this too makes it easy for the spider to crawl duplicates and for hackers to attack the site. Using the robots.txt file to block files such as .rar and .zip archives is therefore also very necessary, if only to enhance the site's security!
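A sketch of such rules follows; the directory names are illustrative examples, and the $ end-of-URL anchor, like *, is a wildcard extension supported by the major crawlers rather than every crawler:

```
User-agent: *
# Security-sensitive directories: logs, backups, database files
Disallow: /logs/
Disallow: /backup/
# Keep stray archive downloads (forgotten site backups) out of the index
Disallow: /*.rar$
Disallow: /*.zip$
```

Blocking via robots.txt only asks well-behaved crawlers to stay away; it is no substitute for actually deleting leftover backup files and password-protecting sensitive directories.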

In a word, blindly treating the spider as an emperor is flattery patted onto the wrong spot. Blocking the right places through proper optimization relieves the spider's workload; that is the highest form of flattery, and the road to a better-optimized site! Source: http://www.wowawowa.cn/ (I Wow I Wow weight-loss net), first published on A5; please credit the source when reprinting. Thank you!
