Duplicate meta tags are the culprit behind the drop in indexed pages

Source: Internet
Author: User


Fresh content is added to the site every day, yet the total number of indexed pages never goes up. Why?

Everyone who does SEO knows that how many pages of a site get indexed matters a great deal; to a large extent this figure determines how well its keywords can rank. So today let's pick up this old but important topic again: what exactly keeps the indexed page count from rising?

There are many explanations, but one of the most important and most easily overlooked is duplicated meta tags.

First, a quick explanation of meta tags. By "meta tags" I mean the description and keywords in a web page, which directly tell the spider the focus and core of that page. A sensible pairing of description and keywords tells the spider what the page's core content is and which keywords you want to highlight, and this pairing can earn better keyword rankings (of course, ranking a single keyword takes more than this). Precisely because description and keywords are treated by SEOers as a focus of optimization, the problem of duplicating them is easily overlooked. Google Webmaster Tools (https://www.google.com/webmasters/tools/dashboard?pli=1) should be familiar to everyone; its content analysis feature can clearly show the meta tag problems on every page of your site.
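To make that duplicate check concrete, here is a minimal Python sketch of the same idea behind such a content-analysis report: fetch a handful of pages, read their description and keywords meta tags, and group any pages that share an identical pair. This is only an illustration, not the Webmaster Tools feature itself, and the URLs are placeholders rather than real pages.

    # Minimal sketch: report pages that share identical description/keywords meta tags.
    from collections import defaultdict
    from html.parser import HTMLParser
    from urllib.request import urlopen


    class MetaCollector(HTMLParser):
        """Collects the content of <meta name="description"> and <meta name="keywords">."""

        def __init__(self):
            super().__init__()
            self.meta = {}

        def handle_starttag(self, tag, attrs):
            if tag != "meta":
                return
            attrs = dict(attrs)
            name = (attrs.get("name") or "").lower()
            if name in ("description", "keywords"):
                self.meta[name] = (attrs.get("content") or "").strip()


    def meta_signature(url):
        """Return the (description, keywords) pair for one page."""
        parser = MetaCollector()
        with urlopen(url) as resp:
            parser.feed(resp.read().decode("utf-8", errors="ignore"))
        return parser.meta.get("description", ""), parser.meta.get("keywords", "")


    def find_duplicates(urls):
        """Group URLs that share an identical description/keywords pair."""
        groups = defaultdict(list)
        for url in urls:
            groups[meta_signature(url)].append(url)
        return {sig: pages for sig, pages in groups.items() if len(pages) > 1}


    if __name__ == "__main__":
        # Placeholder URLs -- replace with pages from your own site.
        pages = [
            "https://example.com/old-page.html",
            "https://example.com/new-page-1.html",
            "https://example.com/new-page-2.html",
        ]
        for sig, dupes in find_duplicates(pages).items():
            print("Identical description/keywords on:", ", ".join(dupes))

Running a check like this over a full sitemap gives you the same kind of warning list the content analysis report produces: every group it prints is a set of pages the spider may treat as one.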

We all know that when a spider crawls a site it first reads the description and keywords to form an initial impression of the page, and only then goes deeper into the page's details. Suppose you have 1 old page and 10 new pages whose description and keywords are exactly the same. The key point is that the spider keeps a data record as it crawls. If it has already crawled your old page and then crawls the first new page, it will conclude it has found a page it already crawled. Why? Because the description and keywords are identical. As we said, the spider first understands a page through its description and keywords and records them as it goes; identical description and keywords create for the spider the "illusion" of the same page. The remaining 9 new pages run into exactly the same situation, so after the spider has crawled your entire site, its records still contain only 1 page, and none of your 10 new pages. (A toy illustration of this follows below.)
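Here is a toy illustration of that "illusion" (it is not how any real search engine works): if a crawler's record were keyed only on the description/keywords pair, the 1 old page and 10 identical new pages would collapse into a single entry.

    # Toy model: a crawler that records pages keyed only by (description, keywords).
    def record_pages(pages):
        seen = {}
        for url, description, keywords in pages:
            signature = (description, keywords)
            # A page whose signature was already recorded looks like a repeat.
            seen.setdefault(signature, url)
        return seen


    old_page = ("/old.html", "SEO tips site", "seo, tips")
    new_pages = [(f"/new-{i}.html", "SEO tips site", "seo, tips") for i in range(1, 11)]

    recorded = record_pages([old_page] + new_pages)
    print(len(recorded))  # 1 -- only the old page's signature was kept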

We must remember that the spider is just a machine program; it cannot recognize and observe the way a human can, so to a large extent we have to take the initiative to show it distinct information for each page. If carelessness or laziness leaves the indexed page count stagnant, the fault is really our own.
