Website structure optimization strategy


A site's internal structure guides the planning and construction of a website, and it plays a key role in content maintenance and SEO. The sections below describe internal optimization in terms of site structure, page elements, navigation structure, and later optimization, providing convenience and a foundation for SEO from the early stage of site construction through later maintenance.

  URL address optimization

Creating descriptive category names and file names for the files on your website not only helps you organize your site structure, but also helps search engines crawl those files more efficiently. Although search engines can crawl complex URLs, relatively simple URLs are helpful for both users and search engines.

The main points of URL optimization are the following (a redirect sketch follows the list):

  • Always use a single canonical URL for the site; do not switch the address back and forth. If the www form of the domain is the main address, redirect the non-www form to it with a 301 (permanent) redirect, as sketched below.
  • Avoid mixing uppercase and lowercase in URLs.
  • Use static URLs as far as possible and avoid dynamic URLs.
  • The shorter the URL the better, and Pinyin or English words in the URL are easiest to remember.
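For instance, if the www form is chosen as the main address, a minimal sketch of the 301 redirect could look like the following (nginx is assumed here; www.domain.com is a placeholder):

    server {
        listen 80;
        server_name domain.com;  # the non-www host
        # permanently redirect every request to the canonical www address
        return 301 http://www.domain.com$request_uri;
    }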

For dynamic websites, pseudo-static techniques (URL rewriting) can make the site look like a static site from the outside, which helps search engines index it.
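As a sketch of the idea (Apache mod_rewrite is assumed; the script name and id parameter are placeholders), a rewrite rule can map a static-looking URL onto the underlying dynamic page:

    # .htaccess - assumes mod_rewrite is enabled
    RewriteEngine On
    # serve the static-looking /article/123.html from the dynamic script article.php?id=123
    RewriteRule ^article/([0-9]+)\.html$ article.php?id=$1 [L]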

  Directory structure

The directory structure should use one or two levels at most, never more than three. Organize the directories along the pattern home page, then section page, then content page; name directories in Pinyin or English, and avoid deeply nested subdirectories.
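For example, a layout that stays within two levels (the names are placeholders):

    www.domain.com/                    home page
    www.domain.com/news/               section page
    www.domain.com/news/article.html   content page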

  Robots.txt

Robots.txt is a text file stored in the root directory of a website that tells search engine crawlers (spiders) which content may be indexed and which may not. The robots.txt protocol is not a formal standard but a convention; search engines usually respect it, but there are exceptions.

When we do not want certain pages of the site to be crawled by search engines, perhaps because those pages would not be useful to users in search results, we can list them in robots.txt, which ensures that this content does not appear in search results.
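A minimal robots.txt sketch (the disallowed paths are placeholders for pages that should stay out of search results):

    # applies to all crawlers
    User-agent: *
    # keep these directories out of search results
    Disallow: /admin/
    Disallow: /tmp/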

  Sitemaps

Sitemaps are a convenient way for webmasters to inform search engines which pages on their sites are available for crawling, somewhat like a blog's RSS feed: a service a site provides for itself. If all sites on the Internet submitted their updates this way, search engines would no longer need to send so many crawlers toiling everywhere; any site with an update would automatically "notify" the search engine, making indexing easier.

Sitemap files are usually in XML format, and the format is simple: the Sitemaps protocol starts with an opening <urlset> tag and ends with a closing </urlset> tag. Each URL is described by a <url> entry as a parent tag, and each <url> tag contains a <loc> child tag holding the URL of the web page. This URL should start with http and be less than 2,048 characters. A <lastmod> tag gives the date the file was last modified; the time portion may be omitted, leaving a date in the form YYYY-MM-DD.
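Putting those tags together, a minimal sitemap sketch (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.domain.com/news/article.html</loc>
        <lastmod>2012-05-01</lastmod>
      </url>
    </urlset>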

At present, Baidu, Google, Yahoo, Microsoft, and other search engines support sitemap submission. Sitemaps can be submitted through each search engine's webmaster platform, and the address can also be published for search engines to find by adding the line Sitemap: http://www.domain.com/sitemap.xml at the end of robots.txt.
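In robots.txt that looks like this:

    # ... crawl rules above ...
    Sitemap: http://www.domain.com/sitemap.xml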

  Website navigation

Site navigation is important for helping users quickly find the content they want, and it also helps search engines understand which content the site considers important. A site map is a hierarchical list of pages that displays the structure of a site. The main purpose of navigation is user convenience, but at the same time it helps search engines crawl the site's pages more comprehensively.

The main method is to create a Site Map page with a natural cascading structure. Such a navigation page makes it easy for users to move from the backbone pages of the site to the specific content they need; where necessary, add further navigation pages on top of a reasonable internal link structure.

Another method is breadcrumb navigation. Breadcrumbs are a row of internal links placed at the top or bottom of a page that let users quickly return to the previous level or to the homepage. Most breadcrumbs start with the most general page (usually the homepage) and become more specific toward the right, for example: Home > Column > Specific article title.
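A minimal HTML sketch of such a breadcrumb (the class name and URLs are assumptions for illustration):

    <nav class="breadcrumb">
      <a href="/">Home</a> &gt;
      <a href="/column/">Column</a> &gt;
      <span>Specific article title</span>
    </nav>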

  The use of nofollow

Nofollow is a link attribute introduced under Google's lead to minimize the impact of spam links on search engines and to reduce comment spam on blogs; Baidu, Google, Yahoo, and Microsoft currently support it. When nofollow appears on a hyperlink, search engines do not count the link's weight and do not use the link for ranking.

There are usually two ways to use nofollow. One is to write nofollow in a meta tag on the page, which tells search engines not to follow any of the links on the page, external or internal. For example: <meta name="robots" content="nofollow">. The other is to put nofollow on an individual hyperlink, telling search engines not to follow that particular link. For example: <a href="http://www.domain.com/" rel="nofollow">link text</a>.

Note that if a site links to sites that search engines consider spam, the site's own weight will suffer. So for safety, every hyperlink on a site that may have been submitted by a third party should carry the nofollow attribute, as shown below.
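For example, a link submitted in a user comment would be rendered like this (the URL is a placeholder):

    <!-- third-party link from a user comment: pass no weight to the target -->
    <a href="http://example.com/" rel="nofollow">visitor's link</a>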

  404 page

A 404 page is the page shown when a user attempts to access a web page that does not exist (because the user clicked a broken link, the page has been deleted, or the user typed the wrong URL). It is called a 404 page because the web server returns the HTTP status code 404 for a request for a missing page, indicating that the page was not found.

Occasionally, users may reach pages that do not exist under the site's domain by clicking a broken link or entering a wrong URL. A custom 404 page can effectively guide such users back to working pages on the site, greatly improving the user experience. A 404 page should ideally offer a link to the site's homepage as well as links to popular content pages.
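A minimal sketch of such a page (the links are placeholders):

    <!-- 404.html: guide the user back to working pages -->
    <h1>Sorry, the page you requested does not exist</h1>
    <p>You can return to the <a href="/">homepage</a>
       or visit a popular section such as <a href="/news/">News</a>.</p>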

It is usually worthwhile to create a custom 404 page in order to retain this portion of user traffic. A good custom 404 page helps users find the information they need, provides other useful content, and encourages users to keep browsing the site.

However, many sites design very simple 404 pages, and some, in order not to lose traffic, automatically redirect the 404 page to the homepage; this is not a design that improves the user experience.
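By contrast, the server can serve the custom page while still returning the real 404 status code instead of a redirect; a minimal sketch (nginx is assumed, inside the server block):

    # serve the custom page for missing URLs, keeping the 404 status code
    error_page 404 /404.html;
    location = /404.html {
        internal;  # the page is served only by nginx itself, never redirected to
    }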

  Use of statistics code

Web analytics tools provide insight into how users find a site and what they do there: which content on the site is most popular, and what effect optimizations have had (for example, whether modifying titles and description meta tags increased traffic from search engines). They show where visitors come from, what they are looking for, which pages they enter on, and which pages they exit from.

The prerequisite for using a site analytics tool is adding its tracking code to the site; common tools include Baidu Tongji and Google Analytics. So as not to affect the site's load speed, the tracking code is best placed at the bottom of the page, as illustrated below.
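As an illustration of that placement (the script URL and site ID are placeholders, not a real vendor snippet), the tracking code is typically loaded just before the closing body tag:

    <body>
      <!-- ... page content ... -->
      <!-- statistics code last, so it does not delay page rendering -->
      <script async src="http://stats.example.com/track.js?site=SITE_ID"></script>
    </body>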

Once the tracking code is in place, the data gathered by the statistics system can be analyzed to determine whether pages need further optimization.


