If search engines cannot get a good look at the content of our site, then however much energy we invest in the site will come to nothing. The best way to avoid this is to plan the structure of the entire site thoroughly in advance.
First, before we begin to build our site, we need to do a good analysis of how search engines crawl and what rules they follow. We know that search engines use "spiders" to crawl the source code of our site and follow its links, collecting our pages and storing them in the search engine's database; that is the indexing process in outline. The search engine then assigns weight and rankings according to certain algorithms and signals, such as page speed and social signals. All of this is worth understanding before we build the site.
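To make that process concrete, here is a toy Python sketch of what such a "spider" does: fetch a page, harvest the links in its source code, and queue them to crawl next, storing each fetched page much as an engine warehouses it in its database. The starting URL and page limit are placeholders, and a real crawler is of course far more sophisticated.

    import re
    from urllib.request import urlopen
    from urllib.parse import urljoin

    def crawl(start_url, max_pages=5):
        # A toy "spider": fetch a page, harvest its links, queue them,
        # and store each page body the way an engine warehouses it.
        seen, queue, store = set(), [start_url], {}
        while queue and len(store) < max_pages:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
            except OSError:
                continue
            store[url] = html  # "warehousing" the page for later ranking
            for href in re.findall(r'href="([^"#]+)"', html):
                queue.append(urljoin(url, href))
        return store

    pages = crawl("http://www.example.com/")  # placeholder start URL
    print(list(pages))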
If search engine spiders can visit, browse, and crawl our pages smoothly, our site's weight and rankings are bound to improve greatly. So how do we make search engines love your site? Below, the author lists five practices from his own SEO work.
(i) Simplify our navigation
I believe many webmasters, like the author, agonize over navigation design when building a site, because navigation settings matter enormously both for passing weight around the site and for a friendly user experience. If we overload the navigation, the code inevitably becomes more complex, and search engines usually find complex code hard to crawl; at the same time, complex navigation keeps users from finding content quickly, which is a real blow to the user experience. So if you want search engine spiders to fall in love with your site, the first step is to simplify your navigation bar.
Simple fix: simplify the site navigation as much as possible, so that users can reach the directory they want within three clicks. We can hang drop-down menus off the main navigation to expose the third- and fourth-level directories without bloating the page.
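The three-click rule is easy to check by hand on a small site; on a larger one, a breadth-first walk over the navigation links will show how deep every page sits. Below is a minimal Python sketch, assuming we already have the site's links as a simple mapping (the site_links data here is invented for illustration).

    from collections import deque

    # Hypothetical link graph: each page maps to the pages its navigation links to.
    site_links = {
        "/": ["/news", "/products", "/about"],
        "/news": ["/news/seo", "/news/industry"],
        "/products": ["/products/cms"],
        "/about": [],
        "/news/seo": [], "/news/industry": [], "/products/cms": [],
    }

    def click_depth(start="/"):
        # Breadth-first search: how many clicks each page is from the home page.
        depth = {start: 0}
        queue = deque([start])
        while queue:
            page = queue.popleft()
            for linked in site_links.get(page, []):
                if linked not in depth:
                    depth[linked] = depth[page] + 1
                    queue.append(linked)
        return depth

    for page, d in sorted(click_depth().items(), key=lambda kv: kv[1]):
        flag = "" if d <= 3 else "  <- deeper than three clicks"
        print(f"{d} clicks: {page}{flag}")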
(ii) Minimize the images and script files used to display site content
We know that the crawling principle of search engines is to use a virtual tool, the "spider", to interpret text- and script-oriented page content. With current search engine technology, however, content presented in Flash and images still cannot be recognized, which is no doubt a big headache for the site's UI designers.
Simple fix: convert that content into forms of site code the search engine can identify. We can also use a spider simulator to crawl our site the way a spider would and observe what comes back; if too much content is lost or cannot be crawled in the simulation, we need to make changes.
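As a rough stand-in for such a spider simulator, the following Python sketch (standard library only) fetches a page and keeps only what a text-oriented spider keeps: the visible text and the links, with script and style content dropped. The example.com address is just a placeholder for one of your own pages.

    from html.parser import HTMLParser
    from urllib.request import urlopen

    class SpiderView(HTMLParser):
        # Collects roughly what a text-only spider sees: visible text and links.
        def __init__(self):
            super().__init__()
            self.in_ignored = 0   # depth inside <script>/<style>, which spiders skip
            self.text, self.links = [], []

        def handle_starttag(self, tag, attrs):
            if tag in ("script", "style"):
                self.in_ignored += 1
            elif tag == "a":
                href = dict(attrs).get("href")
                if href:
                    self.links.append(href)

        def handle_endtag(self, tag):
            if tag in ("script", "style") and self.in_ignored:
                self.in_ignored -= 1

        def handle_data(self, data):
            if not self.in_ignored and data.strip():
                self.text.append(data.strip())

    html = urlopen("http://www.example.com/").read().decode("utf-8", "replace")
    view = SpiderView()
    view.feed(html)
    print("Visible text:", " ".join(view.text)[:200])
    print("Links found:", view.links)

Anything that appears only in Flash or inside an image will simply be absent from this output, which is exactly the loss we are checking for.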
(iii) Do not make inconsistent links
When we build the site's internal links, we must name URLs very carefully, because as we all know, search engines cannot judge content with the intelligence of a person; they usually take the URL as the standard. Sometimes two different pieces of link code actually point at the same content through different URL addresses, and then the search engine simply cannot tell which page is the one you want the link to show. It may look simple to us, and the logic is obvious, but search engines are not that user-friendly, so in many cases we still need to link in the form search engines love.
To avoid leading the search engine into content it cannot judge, we must use consistent code to point to our links, so that each piece of content we link to has a unique URL.
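One practical way to enforce that uniqueness is to pass every internal link through a single normalization function before it is written into a page. The Python sketch below uses hypothetical rules (prefer the www host, treat /dir/index.html and /dir/ as one page) with example.com standing in for your own domain; your canonical rules will differ.

    from urllib.parse import urlsplit, urlunsplit

    def canonical_url(url):
        # Normalize a URL so every link to the same page uses one consistent form.
        parts = urlsplit(url)
        host = parts.netloc.lower()
        if host == "example.com":            # hypothetical rule: prefer the www form
            host = "www.example.com"
        path = parts.path or "/"
        if path.endswith("/index.html"):     # /dir/index.html and /dir/ are one page
            path = path[: -len("index.html")]
        return urlunsplit((parts.scheme or "http", host, path, parts.query, ""))

    print(canonical_url("HTTP://Example.com/news/index.html"))
    # -> http://www.example.com/news/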
(iv) Rational use of 301 redirects
A 301 redirect is a technique we will use often. So when should we use it? First, look at what it does: when the search engine crawls a page, it is forwarded to the page we point it at. We usually use this for domain redirection, redirecting the address without WWW to the one with WWW. Beyond that, our site will in many cases produce duplicate content, and that duplicate content may get indexed by the search engine, generating a lot of junk pages. Deleting them outright may create even more dead links, so instead we can make sensible use of 301 redirects, jumping a duplicate page onto another page. That not only avoids duplicate content but also cuts down the dead links it would otherwise produce. One thing to watch, though: do not chain multiple 301 redirects together.
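A quick way to audit this is to follow a URL's redirects and count the hops: a single 301 straight to the final page is what we want, and a chain of several is exactly what the warning above is about. A small sketch using the third-party requests library, with example.com standing in for your own domain:

    import requests  # third-party: pip install requests

    def check_redirect(url):
        # Follow redirects and print the chain, so chained 301s stand out.
        resp = requests.get(url, allow_redirects=True, timeout=10)
        for hop in resp.history:
            print(f"{hop.status_code} -> {hop.url}")
        print(f"final: {resp.status_code} {resp.url}")
        if len(resp.history) > 1:
            print("warning: more than one redirect hop; tighten the rules")

    check_redirect("http://example.com/")  # hypothetical: expect one 301 to the preferred form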
(v) Correct use of the Sitemap
If you want the site to be indexed better and to be friendlier to search engines, a Sitemap is a good way to help them crawl; but used incorrectly, a wrong Sitemap does real harm to the crawling of our site, so we must make sure the Sitemap's directions are accurate. Of course, the typical CMS backend can generate a Sitemap itself, so usually it is a one-click job. If your site runs on some other platform, we need to download a plug-in that generates the Sitemap automatically; failing that, we can build a sitemap page by hand in HTML code and submit it to the search engines once it is finished.
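For a site with no CMS or plug-in at hand, even a short script can produce a valid sitemap.xml in the standard sitemaps.org format. A minimal Python sketch, assuming we already have the list of URLs to include (the pages list here is invented):

    from datetime import date
    from xml.etree import ElementTree as ET

    # Hypothetical URL list; in practice, pull this from your CMS or your own crawl.
    pages = [
        "http://www.example.com/",
        "http://www.example.com/news/",
        "http://www.example.com/products/",
    ]

    NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=NS)
    for page in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
        ET.SubElement(url, "lastmod").text = date.today().isoformat()

    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8",
                                 xml_declaration=True)

The resulting file can then be submitted through each search engine's webmaster tools.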
Summary: when a search engine does not like a site, then apart from the content being unoriginal or excessively scraped, it is usually one of these five situations; of course there are also smaller details that can go wrong, but after all, every site's situation is different. Article provided by the webmaster of the web game site http://www.2763.net; please retain the source when reprinting.