JavaScript is a thorny problem in SEO. On one hand, we need JavaScript to achieve rich effects when building web pages; on the other hand, JavaScript interferes with search engines' ability to crawl and analyze those pages. Google's official documentation states clearly that heavy use of complex technologies such as JavaScript, cookies, session IDs, frames, DHTML, or Flash in HTML can cause search engine crawlers to have trouble crawling a site:
"Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site."
http://www.google.com/support/webmasters/bin/answer.py?answer=35769
"Google Site Search isn't able to index content contained in JavaScript. The general rule for making sure that a web page can be indexed by Google is to ensure that all of the text that needs to be indexed is visible in a text-based browser, or in a browser with JavaScript turned off."
http://www.google.com/support/customsearch/bin/answer.py?answer=72366
It is not only Google; Yahoo's official documentation makes a similar point:
"Try to use text that search engines can recognize, and avoid heavy use of complex technologies such as JavaScript, cookies, frames, DHTML, or Flash."
http://help.cn.yahoo.com/answerpage_2911.html
Although Baidu offers no explicit statement on the matter, extensive practice shows that Baidu likewise cannot interpret JavaScript.
This creates a problem: too much JavaScript code on a page undoubtedly makes it harder for a search engine to analyze the page's content, and if the links are also built with JavaScript, the search engine cannot even follow them to crawl further pages. Excessive use of JavaScript therefore has the following effects:
1. It interferes with the search engine's analysis of the page content.
2. It reduces keyword density.
3. It seriously hinders the search engine's crawling of pages.
4. It disrupts the distribution of link weight across pages, which is usually reflected in PageRank.
The damage to crawling and to link-weight distribution can be partly compensated for with external links, but the first two points are not so easy to remedy.
So how can we use JavaScript without sacrificing page effects, while remaining search-engine friendly so that SEO is not harmed?
1. Absolutely avoid building navigation and other links with JavaScript. Navigation and links are what search engine crawlers depend on to discover pages; if the search engine cannot crawl a page, that page will never appear in the index, and ranking is out of the question.
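As a minimal sketch of the difference (the URL and link text are made up for illustration):

```html
<!-- Bad: the destination URL only exists inside JavaScript, so a crawler
     that does not execute scripts has no link to follow. -->
<a href="javascript:void(0)" onclick="window.location='/products.html'">Products</a>

<!-- Good: a plain HTML link that any crawler can follow. -->
<a href="/products.html">Products</a>
```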
2. Try to avoid using JavaScript for content. In particular, content related to your keywords should not be rendered through JavaScript; doing so unquestionably lowers keyword density.
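To illustrate the point (the example sentence is invented): text written out by a script is invisible to a crawler that does not execute JavaScript, while the same text as static HTML counts toward keyword density.

```html
<!-- Bad: the keyword-bearing text only exists after the script runs. -->
<script type="text/javascript">
  document.write('<p>Affordable web hosting and domain registration.</p>');
</script>

<!-- Good: the same text as plain HTML, visible to every crawler. -->
<p>Affordable web hosting and domain registration.</p>
```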
3. For JavaScript that is genuinely needed, move the script into one or more external .js files, so that it does not interfere with the search engine's crawling and analysis of the page.
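A before-and-after sketch of this tip (the file name effects.js is hypothetical):

```html
<!-- Before: inline script mixed into the page, adding noise to what the
     crawler has to parse. -->
<script type="text/javascript">
  function showBanner() { /* page-effect code */ }
</script>

<!-- After: the same code moved into an external file; the page itself
     stays clean text, and crawlers typically ignore the external script. -->
<script type="text/javascript" src="/js/effects.js"></script>
```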
4. For scripts that cannot be moved into .js files, place them at the bottom of the HTML, just before the closing </body> tag. That way the search engine only reaches them after it has analyzed the rest of the page, which reduces the interference.
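A skeleton page following this layout might look like the following (content and title are placeholders):

```html
<html>
<head>
  <title>Example page</title>
</head>
<body>
  <h1>Main content first</h1>
  <p>Keyword-rich text that the crawler should analyze before anything else.</p>

  <!-- Scripts that cannot be externalized go last, just before </body>,
       so they are the final thing the crawler encounters. -->
  <script type="text/javascript">
    // page-effect code here
  </script>
</body>
</html>
```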
The methods above all aim to eliminate the negative effects of JavaScript on search engines. But most things cut both ways, and JavaScript is the same: using it is not necessarily bad. To a certain extent, JavaScript can actually play a positive role in SEO.
11545.html "> We have said that search engines can't recognize JavaScript (although Google is now doing a little bit of simple JavaScript code, it should be just simple code like document write). So in another way, we can use JavaScript to filter some spam.
What counts as junk information? From an SEO perspective, it is anything that is not only useless to search engine crawling and analysis but also interferes with keyword density and otherwise hurts the page. Typically this "junk" includes: ads, copyright statements, large numbers of outbound links, and information unrelated to the page's topic. We can throw all of it into one or more .js files, reducing the interference with the page's substantive content, raising keyword density, and presenting the core content of the page to the search engine.
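A sketch of the technique, assuming a hypothetical external file /js/footer.js whose contents are shown in the second block: the ad and copyright blocks are written out with document.write, so human visitors see them while crawlers that do not fetch or execute the script never encounter the text.

```html
<!-- In the page: one external script reference replaces the ad block and
     copyright notice (footer.js is a hypothetical file name). -->
<script type="text/javascript" src="/js/footer.js"></script>

<!-- Contents of /js/footer.js, shown inline here only for illustration: -->
<script type="text/javascript">
  document.write('<div class="ads">Advertisement</div>');
  document.write('<p>Copyright 2012 Example Inc. All rights reserved.</p>');
</script>
```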
If you are interested, next time I can share SEO methods for all-Flash sites.