On common mistakes in SEO

Source: Internet
Author: User


1. Using the wrong target keyword

This is a mistake many people make: they pick keywords they personally assume are right, but that users never actually type into a search engine. For example, you might choose "Apple cider" as a keyword for your site, but it is pointless to optimize that keyword to rank number one if nobody searches for it. Choosing the right keywords is therefore essential.

You can use keyword suggestion tools to help you find the right keywords.

2. Using Flash

Flash has brought a richer user experience to the web: it makes demos and virtual tours more engaging and makes sites more attractive. But search spiders cannot fully index Flash content. As the technology has advanced, Googlebot has become able to read the text and links inside a Flash file, yet it still cannot recognize Flash's structure or the relationships between its elements. In addition, some text inside Flash is stored as graphics, and Googlebot has no eyes for images, so those words are ignored entirely. All of this means that even when your Flash content is indexed, Googlebot may miss some of your text, content, or links. Worse still, other search engines' crawlers recognize Flash even less well than Googlebot. So when you put important content into a Flash animation, that content may never be seen by Google or other search engines, and you lose the chance at a good ranking.

Since search spiders cannot reliably index the contents of Flash movies, if your site must use Flash, do not forget to add relevant text descriptions alongside it.

3. JavaScript navigation menus

Spider programs cannot follow the links in a JavaScript navigation menu. The solution is to rewrite the links in plain HTML (or use <noscript> tags), or to provide alternative paths (such as adding the links to your site map) so that spiders can reach those pages.
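To see why, here is a minimal sketch of how a basic crawler gathers links: it reads only `href` attributes, so a link produced by a JavaScript `onclick` handler is invisible unless a plain-HTML fallback (for instance inside `<noscript>`) exists. The page markup below is invented for illustration.

```python
# Sketch: a simple spider only sees links carried in href attributes.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets the way a basic crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = """
<a href="/products.html">Products</a>
<a onclick="location.href='/hidden.html'">Hidden</a>
<noscript><a href="/hidden.html">Hidden (fallback)</a></noscript>
"""

extractor = LinkExtractor()
extractor.feed(page)
# The JS-only link is discovered only through the <noscript> fallback.
print(extractor.links)  # ['/products.html', '/hidden.html']
```

Without the `<noscript>` fallback, `/hidden.html` would never be crawled.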

4. Ignoring the title tag

The title tag carries significant weight in search engine ranking algorithms, and it is how search engines, news feeds (RSS), and other external consumers understand what your page is about. A good title lets search engines find you and brings you more traffic. A badly written title not only loses that benefit but can seriously hurt the page's ranking and indexing.
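A title can be sanity-checked automatically. The sketch below applies two common rules of thumb (a rough 10-60 character length and presence of the target keyword); the exact limits are assumptions, not official search-engine rules.

```python
# Illustrative title-tag checker; thresholds are rules of thumb, not standards.
def check_title(title: str, keyword: str) -> list[str]:
    """Return a list of problems found with a page title (empty = looks fine)."""
    problems = []
    if not title.strip():
        problems.append("title is empty")
    elif not (10 <= len(title) <= 60):
        problems.append("title length outside the ~10-60 character range")
    if keyword.lower() not in title.lower():
        problems.append("target keyword missing from title")
    return problems

print(check_title("Buy Apple Cider Online - Fresh Pressed Daily", "apple cider"))  # []
print(check_title("Home", "apple cider"))  # two problems reported
```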

5. Excessive attention to META tags

Many people seem to think search engine optimization is just a matter of filling in the keywords and description tags. In fact, META tags matter less and less: do not expect to earn a good position simply by optimizing your meta keywords and description. Worse, what many people call "optimization" is stuffing the META tags with long runs of repeated keywords. This does nothing for the site's rankings, and it may be detected by search engines as cheating and get the site penalized.
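Keyword stuffing of this kind is easy to flag mechanically. The sketch below counts repeated terms in a comma-separated meta keywords list; the repeat threshold is an assumption for illustration.

```python
# Illustrative stuffing check: flag terms repeated in a meta keywords list.
from collections import Counter

def stuffed_keywords(meta_keywords: str, max_repeats: int = 1) -> list[str]:
    """Return the terms that appear more than max_repeats times."""
    terms = [t.strip().lower() for t in meta_keywords.split(",") if t.strip()]
    counts = Counter(terms)
    return [term for term, n in counts.items() if n > max_repeats]

print(stuffed_keywords("seo, seo tips, seo, seo, web design"))  # ['seo']
```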

6. Backlink spam

We all know that the number of backlinks is a very important factor in search engine rankings, so many unscrupulous webmasters exploit this by manufacturing large numbers of useless links: for example, posting spam comments on blogs and message boards that carry links back to their sites. Search engines have taken countermeasures against this practice. If a site gains a large number of backlinks in a short period, Google may place it in the Google Sandbox: its pages are still indexed, but they do not rank.

7. Lack of keywords in the content

Once you have identified your site's target keywords, revise your content and place the keywords where they belong. Bolding or highlighting keywords also helps.

In general, on most search engines a keyword density in the range of 2%-8% is appropriate and helps the site rank.
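The density figure above can be computed with a short sketch. The counting convention used here (words belonging to keyword-phrase matches, divided by total words) is one common definition; there is no single standard formula.

```python
# Keyword density sketch: share of words that belong to keyword-phrase matches.
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return keyword density as a percentage of total words."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    kw_words = keyword.lower().split()
    n = len(kw_words)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw_words)
    return 100.0 * hits * n / len(words)

sample = "apple cider recipes: how to make apple cider at home"
# Two matches of a two-word phrase in ten words -> 40.0
print(keyword_density(sample, "apple cider"))
```

A value of 40% on a real page would be far above the 2%-8% guideline and would itself suggest stuffing.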

8. Using image text and image links

Text rendered as an image is much richer in form and more visually appealing than plain text, so many sites chase visual polish by filling their pages with beautifully rendered image text. This not only slows page downloads; search engines also cannot recognize the content or text inside the images, so they cannot fully crawl the page, which reduces the site's chances of appearing in search results. The negative impact is greatest when the page title (H1 tag) or links are rendered as images, because, as everyone knows, keywords in tags and links, together with an appropriate amount of keywords in the content, are important ranking factors.
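At minimum, image-heavy pages should carry `alt` text so engines have something to read. The sketch below flags `<img>` tags with a missing or empty `alt` attribute; the sample markup is invented for illustration.

```python
# Sketch: find <img> tags whose alt text is missing or empty,
# since search engines cannot read text rendered inside images.
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Record the src of every <img> without usable alt text."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if not (attr_map.get("alt") or "").strip():
                self.missing.append(attr_map.get("src", "?"))

checker = AltChecker()
checker.feed('<img src="logo.png" alt="Site logo"><img src="headline.png">')
print(checker.missing)  # ['headline.png']
```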

9. Canonical URL problems

In the eyes of a search engine, the following two URLs are different:

www.bizhi5.com

bizhi5.com

Although we all know these point to the same page, and might assume a search engine would recognize that immediately, most search engines still struggle with what the industry calls the "canonical URL problem" ("canonical" is a programmer's term for "standard", so the canonical URL is a site's standard or preferred URL). It can seriously undermine your SEO results: if your external links are spread across different versions of the URL, their effect is diluted, which hurts the site's ranking. Moreover, if the different versions are all included in a search engine's database, they create duplicate content pages (two or more pages whose content is identical or mostly identical). Search engines often treat duplicate content as cheating, and even when they do not, they tend to demote duplicated pages.
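One remedy is to pick a single preferred form and normalize everything to it. The sketch below chooses one convention (default scheme, host without "www.", no trailing slash); the preferred form is an assumption for illustration, not a standard, and real sites should also enforce it with server-side redirects.

```python
# Simplified canonicalizer: map URL variants to one preferred form
# so link equity is not split across duplicates.
from urllib.parse import urlsplit

def canonical(url: str) -> str:
    """Return the preferred (canonical) form of a URL variant."""
    if "://" not in url:
        url = "http://" + url  # assume a default scheme for bare hosts
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    path = parts.path.rstrip("/") or "/"
    return f"{parts.scheme}://{host}{path}"

print(canonical("Www.bizhi5.com"))  # http://bizhi5.com/
print(canonical("bizhi5.com/"))     # http://bizhi5.com/
```

Both variants from the example above collapse to the same canonical URL.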

10. Using JavaScript and meta refresh redirects

Meta refresh redirects are easily penalized by search engines as misleading to readers.

As for JavaScript redirects: search engines cannot execute JavaScript, so they cannot detect an automatic redirect implemented in a script. For that reason the technique is often abused to let spiders index the old page while sending human visitors to a new URL whose content is completely different. Such a redirect also prevents spiders from crawling your new URL, so the new URL will not be indexed.

If you are sure you need a redirect, use a 301 permanent redirect. Among the many ways to redirect, a 301 is one of the safest and is the ideal solution.
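A 301 is issued server-side. As a minimal sketch, here is a WSGI-style application that answers moved paths with `301 Moved Permanently` and a `Location` header; the path mapping is hypothetical, and in practice you would usually configure this in your web server rather than application code.

```python
# Minimal sketch of a server-side 301 permanent redirect (WSGI style).
# The old->new path mapping below is hypothetical.
REDIRECTS = {"/old-page.html": "/new-page.html"}

def app(environ, start_response):
    """Serve a 301 for moved paths, otherwise a plain 200 response."""
    path = environ.get("PATH_INFO", "/")
    target = REDIRECTS.get(path)
    if target:
        start_response("301 Moved Permanently", [("Location", target)])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"hello"]
```

A spider requesting `/old-page.html` receives the 301 and transfers the page's accumulated ranking signals to the new URL.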

11. High page similarity

Page similarity is the degree to which the content of two pages overlaps. When two pages are more than about 80% similar, search engines, Google in particular, are likely to treat one as a duplicate or copied page and decline to index it, or even demote or delete it. Search engines compare your site's content not only against other sites but also against the other pages within your own site, looking for duplicate or near-duplicate content.

We do not know exactly how heavily search engines penalize similar pages, and each engine applies its own similarity threshold, so you should keep similarity across your site as low as possible.

This problem is especially common on content detail pages: when a page has little text and many images, such pages easily end up highly similar to one another.

Consider a merchant-information detail page on a classifieds site. Although its content area occupies a large part of the layout, it consists mainly of images with very little text; compared with the surrounding navigation, sidebar, and other fixtures, its own markup accounts for an estimated 10% of the page. If every merchant detail page in the category is built this way (image-heavy, text-light), those pages will be highly similar to one another. And indeed they were: running two randomly chosen pages through a page-similarity checker showed 90% similarity, far above the roughly 80% that search engines will tolerate.

12. Lack of consistency and continuity in SEO work

Many people assume that once a site has been optimized, no further work is needed. In reality, if you want your site to succeed, you must keep watching both changes in search engine ranking algorithms and your competitors.

Search engine optimization is not a one-time job; it must be part of your daily routine.
