SEO Optimization: How to Deal with Duplicate Links

Source: Internet
Author: User


When optimizing a site we often run into all kinds of duplicate pages. There are many causes, but most of them come down to flaws in the site's program, so choosing a mature, well-tested program when building a site is essential and will save you a lot of trouble later. Today I will draw on my own experience handling duplicate pages on my site, Sister Acne Net, to talk about how to deal with them.

301 Redirects for Duplicate Links

Every site will inevitably go through structural or content adjustments. I went through this with Sister Acne Net: when I first built the site I did not pay much attention to optimizing the directory structure, which later made the experience poor for users and unfriendly to search engines. So I decided to restructure, which meant turning the old URLs into new ones while keeping the old URLs accessible. If this is handled badly, two or more URLs end up serving the same page; users may not notice anything, but for search engines it is a genuinely bad signal. What we need to do here is use a 301 redirect to send each old URL to its new one. This preserves the user experience, takes care of the search engines, and passes the weight of the old URL on to the new URL. So whenever you revamp a site, pay attention to setting up 301 redirects.
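As a minimal sketch, assuming the site runs on Apache with mod_rewrite enabled (the directory names below are hypothetical placeholders, not the actual paths on my site), a rule like the following maps everything under an old directory to the same path under a new one with a 301 status:

    # .htaccess sketch: permanently redirect the old directory to the new one
    # (hypothetical paths; adjust to your own structure)
    RewriteEngine On
    RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]

For a single moved page, mod_alias's simpler form, Redirect 301 /old-page.html /new-page.html, does the same job.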

Blocking Duplicate Links with robots.txt

Most of these problems come from the program itself, or from pseudo-static URL rewriting. For example, the Sister Acne Net forum runs on the DZ (Discuz!) forum program, which by itself generates a lot of duplicate URLs; if they are not handled, you end up with many duplicate pages. My approach was first to switch the URLs to pseudo-static form, and then to use robots.txt to block pages whose URLs carry a "?" parameter and pages ending in ".php". Of course, this only avoids most of the duplication; if you want to be thorough, you also need to analyze the site logs, see which duplicate URLs remain, and block those as well.
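As a rough sketch of what that blocking might look like, a robots.txt along these lines disallows both URL patterns; the * and $ wildcards are honoured by the major spiders such as Googlebot and Baiduspider, and the exact patterns should be adapted to your own forum's URL structure rather than copied as-is:

    # robots.txt sketch (illustrative patterns, not Discuz!'s full recommended file)
    User-agent: *
    # Block dynamic URLs that carry a "?" parameter
    Disallow: /*?
    # Block the raw .php URLs once pseudo-static rewriting is in place
    Disallow: /*.php$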

Resolving Duplicates with the Canonical Link Tag

Quite often some content pages, such as product pages, differ only in model or colour but all refer to the same product. In that case the URL needs to be standardized with <link rel="canonical" href="URL" />. This tag, however, is only honoured by Google; Baidu does not support it, so for Baidu we can only block the duplicate URL with <meta name="Baiduspider" content="noindex">.
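Put together, the head of a duplicate variant page might look like the sketch below; the URL is a hypothetical placeholder for the preferred version of the product page:

    <!-- Head of a duplicate variant page (e.g. a different colour of the same product) -->
    <head>
      <!-- Tell Google that the main product URL is the canonical version -->
      <link rel="canonical" href="http://www.example.com/product-123.html" />
      <!-- Baidu does not honour rel=canonical, so ask Baiduspider not to index this page -->
      <meta name="Baiduspider" content="noindex" />
    </head>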

All three of the methods above need to be applied flexibly, and usually in combination, to get the full effect. Finally, the link to Sister Acne Net: www.jjmm5.com. Please keep this link when reprinting.
