Hello everyone, I am an SEO beginner. In class today I learned about five unfavorable optimization factors that every SEO novice must know, and I would like to share the content of my teacher's lecture; if anything is lacking, I hope everyone will forgive me. Here are the five factors:
1st: The site should not contain too many dead links
What is a dead link? A dead link is a link that no longer works. Too many dead links hurt our site's impression in search engines and lower the engine's recognition and scoring of the site. So when dead links appear, we can use robots.txt to block spiders from crawling them, or serve a 404 page to guide users, which protects the site's weight and reduces the bounce rate.
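As a minimal sketch of the robots.txt approach above, assuming the dead pages sit under a hypothetical /old/ directory (the path is only an example, not from the original article), the rules could look like:

```
User-agent: *
# Keep spiders away from the removed section so dead links
# there do not count against the site
Disallow: /old/
```

A 404 page should additionally return the HTTP 404 status code, not a 200, so search engines know the page is truly gone.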
2nd: Keep the website open; do not make pages visible only to members
As shown in the figure above, some sites require users to register or become members before they can browse a page. This practice reduces user stickiness, and on the other hand it hinders the speed and effectiveness of search-engine crawling. It also leaves pages and forum sections short of interaction. The current mainstream approach is to let users log in with Weibo or QQ, so they can join discussions flexibly and conveniently, spending only a few seconds before participating on the site.
3rd: Embed JS code in external files as far as possible
Because spiders have difficulty crawling JS code, it is best to use little or no inline JS when optimizing a site; this helps spiders crawl your site's information better and faster. If the site needs JS for special effects, you can put the JS in a separate file in its own folder and call it from the page, reducing the page size. The smaller the page, the faster it naturally loads.
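To illustrate the point above, here is a sketch of moving an inline effect script into an external file; the file name js/effects.js is a hypothetical example, not from the lecture:

```html
<!-- Before: inline JS bloats the page the spider must crawl -->
<script>
  /* dozens of lines of effect code here */
</script>

<!-- After: the page only carries a short reference;
     the effect code lives in a separate cacheable file -->
<script src="js/effects.js"></script>
```

An added benefit is that the browser can cache the external file, so repeat visits load even faster.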
4th: Flash animation may look good, but spiders cannot read it
Spiders are still not very good at recognizing Flash animation. If an enterprise uses a Flash site for promotion, it is recommended to set up a blog alongside the Flash program to increase the site's content updates and indexing rate, and so gain weight and ranking in the search engines. Among the current mainstream blog programs, I personally feel the ones that do this best are z-blog and WordPress; both make it very convenient to build a site and are easy to master.
5th: Do not use dynamic URLs
URLs come in three kinds: dynamic, pseudo-static, and static. Spiders do not crawl dynamic URLs easily, while the other two are spiders' favorites, so under normal circumstances it is best to use pseudo-static or static URLs when building a site. In the WordPress program, for example, simply write %post_id%.html in the custom permalink setting and pseudo-static URLs are done. You can also use a robots.txt rule such as Disallow: /*? to block search engines from crawling dynamic pages and reduce duplicate crawling.
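Putting the two tips above together, a sketch of the robots.txt rule for blocking dynamic URLs (the wildcard syntax shown is the one Google and Baidu support in Disallow lines) might be:

```
User-agent: *
# Block any URL containing a "?" so spiders skip
# dynamic duplicates of the pseudo-static pages
Disallow: /*?
```

With the WordPress permalink set to a pseudo-static pattern like /%post_id%.html, spiders then only see one clean URL per post.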
The above five points are the unfavorable optimization factors I learned today. I hope friends like me can avoid these problems during the optimization process and so get faster, more efficient optimization results and learning experience. If any friends are interested in optimization, you can visit my centrifugal pump site (www.gaodunbf.com) and exchange ideas with me. When reprinting, please attach the A5 original link; thank you for retaining the copyright.