Winning by surprise: website design to cope with Baidu's new strategy

Source: Internet
Author: User

Recently, Baidu's policy updates have prompted the Internet industry to "focus on content construction", and webmasters have begun to discuss how to build their websites: what kind of website design suits the spider's appetite? I have recently been studying how to design a website that caters to Baidu's new strategy, and I will share my thoughts here.

First: friendly navigation settings. When building a website, you must consider not only the design of the homepage but also the design of the internal pages and how the spider will crawl them. Many websites place content only on the homepage and provide no links to the internal pages; after a spider crawls the homepage content, it simply leaves the site. Such a design lets the spider include only the homepage, while the other pages are never indexed. For full page indexing, a site's internal pages should be no more than five clicks away from the homepage, which requires friendly navigation: top-level navigation pointing to the site's topic pages, "more" links on the homepage leading to the internal page lists, and breadcrumb navigation on internal pages (homepage -- topic page -- content page). There are also several taboos in navigation design: do not use image-based navigation, and do not use JS jumps for navigation. Simple anchor-text navigation offers the spider the least crawling resistance and gives users the best experience.
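As a sketch of the breadcrumb navigation described above, a content page could use plain anchor-text links like the following (the page names and URLs here are hypothetical):

```html
<!-- Breadcrumb navigation: homepage -- topic page -- content page -->
<!-- Plain anchor-text links, no images or JS jumps, so the spider can follow them -->
<div class="breadcrumb">
  <a href="/">Homepage</a> &gt;
  <a href="/topic/">Topic Page</a> &gt;
  <span>Content Page</span>
</div>
```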

Second: spider-friendly page structure design. Many product websites use large amounts of JS code, images, and Flash animation to achieve visual effects, but these are unreadable to the spider. Website design should take into account not only visual effect but also how well the site survives in search engines. Using the web page optimization tool released by Baidu, webmasters will find that Baidu has high requirements for code simplification. JavaScript code should be placed at the end of the page to reduce the spider's request time, and the site's CSS style sheets should be merged to remove unnecessary code. Avoid the frame structure as far as possible, since it is difficult for a spider to identify frame and Flash code. For ad sections and other areas that should not pass weight, webmasters can use iframes and nofollow tags to avoid unnecessary loss of link weight.
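A minimal page skeleton reflecting these suggestions might look like this (all file names and URLs are assumptions for illustration): CSS merged into a single stylesheet, JavaScript moved to the end of the body, and the ad section isolated with an iframe and nofollow:

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Merge all CSS into one file to reduce unnecessary code and requests -->
  <link rel="stylesheet" href="/css/all.css">
</head>
<body>
  <!-- Main content first, readable by the spider -->
  <p>Page content...</p>

  <!-- Ad section isolated in an iframe so it passes no weight -->
  <iframe src="/ads.html" width="300" height="250"></iframe>
  <!-- Outbound link that should not pass weight -->
  <a href="http://ads.example.com/" rel="nofollow">Sponsored link</a>

  <!-- JavaScript placed at the end of the page to reduce the spider's request time -->
  <script src="/js/all.js"></script>
</body>
</html>
```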

Third: friendly website redirect code. For a search engine spider, the only redirect that is not considered cheating is the 301 redirect, which transfers the full weight to the new URL. The other redirect methods -- 302 redirects, JS jumps, and meta refresh jumps -- are all treated as cheating by the spider. Since the purpose of a redirect is to transfer weight, there is no need to choose anything other than a 301. ASP code for a 301 redirect on a Windows host (replace the example URL with your own domain name):

```asp
<%@ Language=VBScript %>
<%
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.example.com/"  ' your domain name here
Response.End
%>
```

The equivalent PHP code:

```php
<?php
header("HTTP/1.1 301 Moved Permanently");
header("Location: http://www.example.com/"); // your domain name here
exit;
?>
```

Fourth: spider-friendly static page settings. Baidu's new strategy places high demands on content quality, and webmasters strive to create more content for their websites; static page settings can help that content get indexed. Dynamic URLs cause inconvenience to crawlers when capturing content: a crawler can easily fall into an endless loop or repeatedly include duplicate pages. To have website pages fully indexed, convert dynamic URLs into static URLs during website design. Some webmasters will say that dynamic pages can also be indexed; crawlers can indeed identify dynamic addresses, but those addresses may still cause indexing difficulties. There is a simple way to reduce the spider's crawling trouble that will make webmasters happy: add a `Disallow: /*?*` rule to the robots.txt file to block parameterized URLs. It may take some time for the search engine to drop the duplicate content it has already recorded, so webmasters who add this rule should not worry; wait patiently for the index to refresh.
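As a sketch, a complete robots.txt containing this rule can be as small as the following (the `User-agent` line is standard robots.txt syntax, not from the original article):

```
User-agent: *
# Block any URL containing a "?", i.e. dynamic URLs with parameters
Disallow: /*?*
```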

Webmasters all work hard to operate their websites; if details like these cause a website to be punished, wouldn't that be a shame? This is why good website design can win by surprise. No matter how Baidu updates its strategy, as long as webmasters handle the details well and keep focusing on content construction to improve the user experience, their websites will naturally survive for a long time.

Original source of this article: http://www.hlqxc.org (Red and Black Alliance). Please indicate the source when reprinting.
