Spider traps every SEOer has to know

When it comes to spider traps, many SEO novices have probably never heard of them. What is a spider trap? A spider trap is anything that obstructs or interferes with a spider crawling a site. Many spider traps are built deliberately, and many others are created by carelessness. Since a spider trap is a trap for spiders, users are not affected: the web interface may look perfectly normal while a spider trap hides inside. Eliminating these traps lets spiders crawl and index the site far more thoroughly. Here, then, are the spider traps every SEOer has to know.

1. Flash Animation

Some friends will ask why Flash counts as a spider trap. The answer is simple: spiders cannot read Flash. Many sites like to put a Flash animation on the homepage that then jumps to an HTML page; after watching the intro, visitors are taken to the real HTML site. But what about search engine spiders? They cannot follow the Flash through to the HTML pages, so the Flash becomes a wall that stops them from crawling the rest of the site. If the Flash effect is required, wood-wood SEO recommends placing an ordinary HTML link to the HTML version of the site next to the homepage Flash file, so spiders can follow that link and crawl the HTML pages behind it.
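As a rough sketch of that workaround (the file names and paths here are hypothetical, and a Node/Express server is assumed), the homepage can keep the Flash embed while also emitting an ordinary HTML anchor that a spider can follow into the HTML version of the site:

```typescript
import express from "express";

const app = express();

// Hypothetical homepage: the Flash intro is still shown, but a plain HTML
// link sits beside it so a spider that cannot read Flash can still reach
// the HTML version of the site.
app.get("/", (_req, res) => {
  res.send(`
    <object data="/intro.swf" type="application/x-shockwave-flash"
            width="800" height="600"></object>
    <a href="/home.html">Enter the HTML version of the site</a>
  `);
});

app.listen(3000);
```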

2. JS Link

JavaScript can create many compelling visual effects, and some sites like to use JavaScript to generate their entire navigation system. This is one of the more serious spider traps, because spiders cannot execute JS. JS effects can still be added to a site, and links you deliberately do not want crawled can even be hidden inside JS, but links that need to be crawled must never live only in JS, or the spider cannot continue past them. Also, do not pile up too much JS on the page; it is better to move scripts into a dedicated JS file.
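To make the contrast concrete, here is a minimal sketch with a hypothetical menu: the first function builds the navigation only with client-side script, which a spider that does not run JS never sees, while the second emits the same menu as plain anchors in the HTML itself:

```typescript
// Hypothetical menu data, used for illustration only.
const menu = [
  { title: "Products", url: "/products/" },
  { title: "News", url: "/news/" },
];

// Spider trap: these links only exist after the browser runs this script,
// so a crawler that does not execute JS never sees them.
function renderNavWithJs(): void {
  const nav = document.getElementById("nav")!;
  for (const item of menu) {
    const a = document.createElement("a");
    a.href = item.url;
    a.textContent = item.title;
    nav.appendChild(a);
  }
}

// Crawl-friendly alternative: emit the same menu as plain anchors in the
// HTML the server sends, so the links are present before any JS runs.
function renderNavAsHtml(): string {
  return menu.map((item) => `<a href="${item.url}">${item.title}</a>`).join("\n");
}
```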

3. Dynamic URL

Today's spiders can crawl some dynamic URLs, but a dynamic URL is generated directly from the database and carries question marks, equals signs and many other parameters. Such addresses are not only unfriendly to spiders, they can also send a spider into an endless crawl loop, which makes them a deadly trap. In most situations, therefore, the recommendation is a static (or statically rewritten) URL scheme.
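One common remedy, sketched below with hypothetical routes on an assumed Node/Express server, is to expose a clean, static-looking path and permanently redirect the old parameterized address to it, so spiders never have to chew through parameter variations:

```typescript
import express from "express";

const app = express();

// Clean, static-looking URL that spiders should crawl (hypothetical route).
app.get("/article/:id", (req, res) => {
  res.send(`Article ${req.params.id}`);
});

// Old dynamic URL such as /article.php?id=123&cat=5: 301 it to the clean
// form so spiders stop crawling the parameterized variations.
app.get("/article.php", (req, res) => {
  const id = req.query.id;
  if (typeof id === "string") {
    res.redirect(301, `/article/${id}`);
  } else {
    res.status(404).send("Not found");
  }
});

app.listen(3000);
```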

4. Session ID

Some sites use a session ID to track visitors: each user is assigned a unique session ID that is appended to the URL when they visit. A search engine spider is treated as a new visitor on every request, so a different ID is appended each time and the spider reaches the same page through many different URLs. The result is obvious: a mass of highly repetitive, duplicate-content pages.
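A minimal sketch of one fix, assuming a Node/Express server and a hypothetical `sid` query parameter: any URL still carrying the session ID is permanently redirected to the clean address, so every crawl of a page resolves to a single URL (the session itself can live in a cookie instead):

```typescript
import express from "express";

const app = express();

// If a request arrives with a session ID in the URL (e.g. /page?sid=abc123),
// strip it and issue a permanent redirect to the clean URL, so the spider
// does not index the same page under countless sid variations.
app.use((req, res, next) => {
  if (typeof req.query.sid === "string") {
    const clean = new URL(req.originalUrl, "http://example.com");
    clean.searchParams.delete("sid");
    res.redirect(301, clean.pathname + clean.search);
  } else {
    next();
  }
});

app.listen(3000);
```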

5. Frames

A frame structure is, like Flash and JS, a wall for spiders: it hinders them from crawling and indexing the content inside the frames, and lets them fall straight into a trap.

6. Require login

This is very common: some sites make certain content visible only after the user logs in, and some require login for the entire site. For users that may be fine, but for spiders it is a trap. Why? A spider cannot fill in a username and password, and it will not register, so it simply cannot get in to crawl your content. Unless the content is VIP-only or something you do not want crawled, it should be left open to spiders so they can reach it.
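As a rough sketch (the paths are hypothetical and a Node/Express server is assumed), login can be demanded only for the genuinely private area while ordinary content stays open, so the spider is blocked only where that is intended:

```typescript
import express from "express";

const app = express();

// Only the VIP area demands login; everything else stays open to spiders.
app.use("/vip", (req, res, next) => {
  const loggedIn = Boolean(req.headers.cookie?.includes("session="));
  if (!loggedIn) {
    res.status(401).send("Please log in to read VIP content.");
  } else {
    next();
  }
});

// Public article pages: crawlable without any login.
app.get("/article/:id", (req, res) => {
  res.send(`Public article ${req.params.id}`);
});

app.listen(3000);
```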

7. Compulsory use of cookies

Some sites force users to accept cookies in order to provide certain functions, such as remembering login information or tracking the visit path; if cookies are disabled in the browser, the page does not display properly. A search engine spider is effectively a browser with cookies disabled, so forcing cookies only prevents the spider from accessing the page normally.
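A small sketch of graceful degradation, assuming a Node/Express server: the page reads a preference from a cookie when one is sent, but still renders a normal page when the visitor, or a spider, sends no cookies at all:

```typescript
import express from "express";

const app = express();

// The page never *requires* a cookie: if one is present it refines the
// experience (a remembered language here), otherwise a sensible default
// is used, so a cookieless visitor or spider still gets a normal page.
app.get("/", (req, res) => {
  const match = /(?:^|;\s*)lang=([^;]+)/.exec(req.headers.cookie ?? "");
  const lang = match ? match[1] : "en";
  res.send(`<p>Welcome! (language: ${lang})</p>`);
});

app.listen(3000);
```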

8. Various jumps

At present the only recommended site redirects are 301 and 302; spiders are very sensitive to every other kind of jump, because black-hat SEOs love them: they hack someone else's website and plant a jump on it that points back to their own site. JS jumps, meta-refresh jumps and the like are therefore not recommended. In particular, if your 404 page must jump somewhere, the recommended delay is more than 5 seconds.
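To make the distinction concrete, here is a hedged Express sketch (the paths are hypothetical) of the two acceptable server-side redirects, plus a 404 page whose jump, if you insist on one, waits longer than the 5 seconds mentioned above:

```typescript
import express from "express";

const app = express();

// 301: the old address has moved permanently.
app.get("/old-page", (_req, res) => res.redirect(301, "/new-page"));

// 302: a temporary move.
app.get("/promo", (_req, res) => res.redirect(302, "/promo-current"));

// 404 page: if it jumps at all, delay the meta refresh by more than 5 seconds.
app.use((_req, res) => {
  res.status(404).send(`
    <meta http-equiv="refresh" content="6;url=/">
    <p>Page not found. You will be taken to the homepage in a few seconds.</p>
  `);
});

app.listen(3000);
```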

9. Website link structure

For spiders, the main concern is the site's link structure. A site's link structure is like a house: the links are the doors, and the spider is a friend we have invited over. If the link structure is a chaotic mess, it is as if the friend arrives at your home, cannot get their bearings, and cannot even find the way in and out. If the spider cannot even find the links, how is it supposed to crawl? So the link structure should stay a flat tree structure.
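As a small illustration of what "flat tree structure" means in practice, the sketch below (the page graph is hypothetical) walks the internal links outward from the homepage and flags any page sitting more than three clicks deep, i.e. the rooms the invited friend can never find:

```typescript
// Hypothetical internal-link graph: page -> pages it links to.
const links: Record<string, string[]> = {
  "/": ["/products/", "/news/"],
  "/products/": ["/products/widget-a", "/products/widget-b"],
  "/news/": ["/news/2014/post-1"],
  "/products/widget-a": [],
  "/products/widget-b": [],
  "/news/2014/post-1": ["/news/2014/post-1/comments"],
  "/news/2014/post-1/comments": ["/news/2014/post-1/comments/page-2"],
  "/news/2014/post-1/comments/page-2": [],
};

// Breadth-first walk from the homepage, recording each page's click depth.
function clickDepths(start: string): Map<string, number> {
  const depth = new Map<string, number>([[start, 0]]);
  const queue = [start];
  while (queue.length > 0) {
    const page = queue.shift()!;
    for (const next of links[page] ?? []) {
      if (!depth.has(next)) {
        depth.set(next, depth.get(page)! + 1);
        queue.push(next);
      }
    }
  }
  return depth;
}

// Pages deeper than 3 clicks are hard for both friends and spiders to reach.
for (const [page, d] of clickDepths("/")) {
  if (d > 3) console.log(`Too deep (${d} clicks): ${page}`);
}
```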

Now that you know these possible spider traps, take care not to build them into your site. Go and look over your site now and see whether any of these traps are present. Welcome to follow me on Sina Weibo: wood-wood SEO Blog

Author: wood-wood SEO http://blog.sina.com.cn/s/blog_c206a2c30101g7wx.html
