Website Internal Structure Optimization

Large websites today often have very strong SEO teams, yet as far as Q Pig knows, no well-known large site has solved the problem of site structure well. Search engines place a limit on how many of a site's pages they will index. There are two ways to break through that limit: build a huge external-link system, for example to lift indexing from 10 million pages to 30 million, or optimize the site's internal structure.

Site structure optimization involves the following six parts:

1. Column Design

Analysis: Column design was already touched on under market research and competitor analysis; the columns are the outcome of keyword research. In an earlier case study of Robin's square dance site, Q Pig saw that after Robin researched the square dance keywords, the columns he defined included square dance teams and square dance information.

2. Navigation Design

Analysis: A page's position within the whole site is conveyed by the navigation system. Navigation is equally important to users and to search engines. When a search engine or a user lands directly on an internal page, only the navigation system tells it which page it is on, where that page sits in the site, whether it is a directory page or a content page, what its parent page is, and what its child pages are.

Whether the navigation is clear matters a great deal to visitors, and it has a decisive influence on how clearly Baidu understands the site's structure and on where the site's weight flows.

One thing to note: do not stuff keywords into the navigation. Many webmasters like to write the site's core keyword as the anchor text of the breadcrumb link pointing to the home page; for example, the home-page link in the breadcrumbs of Q Pig's literature station was once written as a romance-fiction keyword. For a search engine, however, the main role of navigation is to show where a page sits within the site, not to pass the site's keywords to the search engine.

So Q Pig's suggestion is: whether in the site navigation or the column navigation, it is best to write the home-page anchor text simply as "Home". Many webmasters also like to place a block of links at the bottom of the page that mirrors the top navigation; here too Q Pig recommends not stuffing keywords into the footer, and the footer navigation can even be removed entirely.
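As a rough illustration of that advice, a breadcrumb might look like the sketch below (the markup, class name, and category are hypothetical, not taken from any real site):

    <!-- Breadcrumb sketch: the home link reads "Home" instead of a stuffed keyword -->
    <nav class="breadcrumb">
      <a href="/">Home</a> &gt; <a href="/romance/">Romance Novels</a> &gt; Current Article
    </nav>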

3. Spider Traps

Analysis: A spider trap is any page-display technique that hinders the search engine spider from crawling a page, such as building site links with JavaScript, Flash, iframes, framesets, or session IDs, or programs that send the crawling spider into an endless loop, such as a perpetual-calendar script.

In fact, all of these display effects can be achieved with div-and-CSS layout instead. For example, many webmasters like to use images for site navigation; these can be replaced with text links styled by a CSS style sheet, and CSS is much friendlier to search engines.
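A minimal sketch of such a text-based menu (the column names and class name are only illustrative) might be:

    <!-- Plain HTML text links a spider can follow; the appearance is handled entirely in CSS -->
    <ul class="site-nav">
      <li><a href="/">Home</a></li>
      <li><a href="/dance-teams/">Dance Teams</a></li>
      <li><a href="/news/">News</a></li>
    </ul>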

4. Blocking Indexing

Analysis: In the site structure there are pages, such as About Us and Contact Us, that must exist for users but contribute nothing to the site's weight. To some extent, these pages can be blocked from being indexed.

The main methods for blocking them are the robots.txt file and the noindex tag.

One thing to note here: the most important role of a robots.txt file is to forbid search spiders from crawling a page's content, not to forbid the search engine from indexing it. Suppose site A blocks all search engines in its robots.txt, while site B, already crawled and indexed, contains links or anchor text pointing to site A. If the search engine crawls and indexes those links, an entry for site A can still appear in its index, built from site B's links and their anchor text; only site A's own content will stay out of the search engine's database.
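A minimal robots.txt sketch along these lines (the paths are hypothetical) blocks crawling of the low-value pages mentioned above while leaving the rest of the site open:

    # Block crawling of pages that add no weight; everything else stays crawlable
    User-agent: *
    Disallow: /about/
    Disallow: /contact/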

The noindex tag, placed in the page's head, does exactly the opposite of robots.txt: it lets search engines crawl the page's content but tells them not to index it. The noindex tag plays two roles: it stops the page from casting votes through its links and adding weight to them, and it keeps the noindex-ed content out of the rankings, which makes it easier to concentrate the site's weight.
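The noindex directive is normally written as a robots meta tag inside the page's head; a minimal sketch:

    <!-- Allow crawling, but keep this page out of the index -->
    <meta name="robots" content="noindex">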

Another well-known tag is nofollow. Its biggest benefit for a site is saving the search spider's crawl time: if the site has a large number of unimportant pages, nofollow lets the spider spend more of its time crawling the pages with important content.

As for the search engines themselves, Google's spider is very strong at crawling, so even without nofollow most of a site's pages can still be crawled; for Baidu, the benefit of using nofollow is much more obvious.
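In practice nofollow is an attribute on the link itself; for example (the URL and link are hypothetical), a login link that the spider need not spend time on:

    <!-- Tell spiders not to follow this unimportant link -->
    <a href="/login/" rel="nofollow">Log in</a>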

5. Internal Links

Analysis: Internal links are links between content pages under the same domain. For a site they serve two main functions. The first is to provide paths for pages to get indexed: links placed inside the body text help more of the site's pages get crawled and indexed. Besides in-text links, the related-article recommendations below the text are another good spot for internal links. Since the main purpose here is to increase indexing, the recommended articles should preferably be ones that are not yet indexed, and the selection should be as random as possible.

The second function is to pass anchor text. In navigation design Q Pig advised against stuffing keywords, especially in the home-page link, but within the body of an article you can add some anchor-text links. These anchors may not be as important as the anchor text of external links, since the site is only describing itself, but they are at least a hint to the search engine. For example, if the body of a Q Pig blog post mentions "Q Pig" and links that text, the search engine may realize that the blog's author is called Q Pig.
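A sketch of such an in-text anchor (the surrounding sentence is made up for illustration):

    <!-- In-body anchor text hinting at who "Q Pig" is -->
    <p>As <a href="http://www.seozoro.com/">Q Pig</a> mentioned in an earlier post, ...</p>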

6. Website Normalization

Analysis: URL normalization (canonicalization) is the process by which a search engine picks the best URL as the real URL. The four addresses http://www.domainname.com, http://domainname.com, http://www.domainname.com/index.html and http://domainname.com/index.html all return the same content. People who know little about websites, especially some company managers, may think it makes no difference which path leads to a page, as long as the content can be seen.

In fact, to a search engine these four addresses mean completely different things, and many URLs returning the same content causes trouble for crawling. For URL normalization there are two main recommendations:

1. When linking to other pages inside your site, especially the home page, use only one URL. Whether it includes www or not, stick to a single version from beginning to end, so the search engine knows which URL is the normalized home page.

2. You cannot control which URL other sites use when linking to your home page, so on your host server you should set up 301 redirects from every URL that could serve as the home page to the home-page URL you have chosen.
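A minimal sketch of such redirects, assuming an Apache server with mod_rewrite enabled (other servers such as nginx need their own equivalent rules) and using the example domain above:

    RewriteEngine On
    # Send the non-www host to the www version
    RewriteCond %{HTTP_HOST} ^domainname\.com$ [NC]
    RewriteRule ^(.*)$ http://www.domainname.com/$1 [R=301,L]
    # Send /index.html to the root URL
    RewriteRule ^index\.html$ http://www.domainname.com/ [R=301,L]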

In fact, the most reliable approach to URL normalization is to stick to one URL from beginning to end, whether the addresses are dynamic or static; where necessary, URL rewriting (pseudo-static URLs) can be used so that all of the site's URLs are unified, as on Q Pig's literature station.

URL normalization also touches on another question, content de-duplication and keeping articles original; interested readers can see another of Q Pig's articles on how to keep website articles original. Site structure optimization boils down to two core points: letting both users and search engines understand the site's structure very clearly, and highlighting the content the site most wants to recommend. This article was originally published by Tongxiang SEO (http://www.seozoro.com/); please respect the copyright and indicate the source when reprinting.
