Does Implementing W3C Web Standards Benefit Search Engine Optimization?

Source: Internet
Author: User
Keywords: search engines, web standards, SEO

W3C is the acronym for the World Wide Web Consortium, the body behind the www. The consortium was founded in October 1994 at the Massachusetts Institute of Technology's Laboratory for Computer Science. Its founder is Tim Berners-Lee, the inventor of the World Wide Web.

The consortium is a non-profit organization that develops web standards such as HTML, XHTML, CSS, and XML. Its roughly 500 members include manufacturers of technology products and services, content providers, corporate users, research laboratories, standards-setting bodies, and government departments, all working together to reach consensus on the direction of the Web's development.

Since the Web's inception, every step of its development, from the maturing of its technologies to the expansion of its fields of application, has been inseparable from the efforts of the W3C, founded in October 1994. The consortium is an international organization dedicated to creating Web-related technical standards and promoting the Web's development in deeper and broader directions.

For some time, many people have been asking the same question: does following Web standards improve search engine rankings? Unfortunately, the answer is not so clear-cut. To find it, we need to understand how a search engine "thinks".

When ranking pages, search engines look at a set of quality signals that offer clues about what a page covers and what its content is worth. One signal we can identify is the number of reputable inbound link sources: if many good sites link to a page, it probably has high-quality content, so it should rank near the top of the relevant search results. This quality signal is what made Google the leader of the search engine industry, while rival algorithms relied on less trustworthy signals. Clearly, in a multi-billion-dollar search industry, choosing the right quality signals is paramount.
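The link-counting idea above can be sketched in a few lines of code. The following is a minimal illustration in the spirit of link-based scoring such as PageRank, not Google's actual algorithm; the example graph, damping factor, and iteration count are assumptions chosen for demonstration:

```python
# Illustrative sketch of link-based page scoring (in the spirit of
# PageRank). The graph, damping factor, and iteration count are
# assumptions for demonstration, not any search engine's real settings.

def link_score(links, damping=0.85, iterations=50):
    """links maps each page to the set of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    score = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline score, plus shares of the
        # scores of the pages that link to it.
        new = {p: (1.0 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * score[page] / len(targets)
                for t in targets:
                    new[t] += share
        score = new
    return score

# Three small sites: "a" and "b" both link to "c", so "c" accumulates
# the most link-based score, mirroring the "many good inbound links"
# signal described above.
scores = link_score({"a": {"c"}, "b": {"c"}, "c": set()})
print(max(scores, key=scores.get))  # prints "c"
```

The page with the most reputable inbound links ends up with the highest score, which is exactly the intuition behind treating links as a quality signal.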

So if we know that links are a quality signal, we should in principle be able to work out the relationship between Web standards and SEO. But one obstacle stands in the way: because search engine companies live and die by their proprietary algorithms, they guard their ranking criteria closely and are unwilling to disclose trade secrets to rivals or to anyone who might try to game the system.

How much weight do search engines give to standards compliance? A simple but interesting experiment from the overseas SEO community offers some clues. One practitioner created a series of pages about "Lodefizzle", a word he coined, and marked them up in different ways. Because the invented word appeared only on his own pages, the results isolated more clearly which quality signals determine search engine rankings.

The findings confirmed some common assumptions but, surprisingly, overturned others. He found the following:

1. Semantic markup does improve page rank: marking a keyword up with a heading tag such as <h1> works better than presenting the same content with purely presentational markup (such as a styled <font> tag).

2. Using complex nested tables for layout actually damages search rankings to some extent, presumably because the ratio of code to content rises, making the page appear less valuable.

3. Invalid code can significantly hurt search rankings. In some cases, it even prevents the page from appearing in the results at all!

4. Semantically correct markup does help improve search rankings, but other techniques are sometimes more effective: having keywords occur naturally in the body text (surrounded by other words), using keywords in file names, placing keywords in the anchor text of internal links, and putting keywords in the title tag.

5. In certain circumstances, pages that do not follow web standards rank higher than pages that do.
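Findings 1 and 2 both come down to how much of a page is actual content versus markup. The following sketch makes that concrete by comparing a semantic page against a nested-table version of the same content; the ratio metric is an assumption for illustration, not a metric any search engine is known to use:

```python
# Rough illustration (not a search engine's actual metric) of why heavy
# nested-table markup can hurt: more bytes of code per byte of visible
# text makes the content a smaller fraction of the page.

from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects only the visible text from an HTML fragment."""
    def __init__(self):
        super().__init__()
        self.text = ""
    def handle_data(self, data):
        self.text += data

def content_ratio(html):
    """Fraction of the page that is visible text rather than markup."""
    parser = TextExtractor()
    parser.feed(html)
    return len(parser.text.strip()) / len(html)

# The same content marked up two ways.
semantic = "<h1>Lodefizzle</h1><p>All about Lodefizzle.</p>"
tabled = ("<table><tr><td><table><tr><td><font size='5'>Lodefizzle</font>"
          "</td></tr></table></td></tr><tr><td>All about Lodefizzle."
          "</td></tr></table>")

print(content_ratio(semantic) > content_ratio(tabled))  # prints True
```

The semantic version devotes a far larger share of its bytes to real content, which matches the experiment's observation that bloated table layouts make a page look less valuable.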

From this study, we learn that Web standards promote the right approach: they let search engines interpret pages successfully and rank them meaningfully, so standards-based pages are more likely to rank well and be found. However, whether a page validates is not necessarily an important quality signal for the major search engines. To complicate matters further, search engine algorithms change frequently to counter new SEO methods, both white hat (honest) and black hat (dishonest). Testing your site experimentally may be the best way to keep up with changes in the major search engines' algorithms.


The content source of this page is from Internet, which doesn't represent Alibaba Cloud's opinion; products and services mentioned on that page don't have any relationship with Alibaba Cloud. If the content of the page makes you feel confusing, please write us an email, we will handle the problem within 5 days after receiving your email.
