Summary: Many webmasters habitually review their site logs, mainly to analyze how spiders crawl the site. Most only check the number of spider visits per day, which is usually enough to spot a problem.
Many webmasters are in the habit of checking their site logs, chiefly to analyze how search engine spiders crawl the site. Most of us simply look at how many times the spiders visited that day, which is usually enough to spot a problem. Sometimes, though, a site seems quite polished and the optimization work is on track, yet the spiders stay cold, which is very frustrating.
The principle is easy to understand: the search engine sends spiders out to hunt for "prey" on the great web of the Internet. A spider follows links as clues to discover new links, and the engine finally ranks the pages with its complex algorithms and stores them in its database. Many websites lose spider visits because of a confused site structure, but that is far from the only reason spiders stay away. Let us summarize the common ones together:
Too much content such as pictures and Flash
It must be said that search engines have become fairly intelligent by now. Last year Google launched image search, which can analyze a picture's colors and pixel ratios and trace its source. Still, there is a sizable gap before search engines can fully identify the information inside an image. Many webmasters either know nothing about search engine optimization or, chasing a beautiful, content-rich look, embed large numbers of high-definition pictures and Flash in their pages. However good your content is, the spider cannot read it and can only pass it by. There are now many spider-simulation tools online; if this sounds like your site, use one to query your pages and see how big the gap is between what a spider can find and what a human visitor sees.
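To make the gap concrete, here is a minimal sketch of what a text-only crawler "sees": it keeps visible text and image alt attributes and discards everything else. The page markup below is a made-up example, and real spiders are far more sophisticated.

```python
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    """Collects only what a text-based crawler can read from a page."""
    def __init__(self):
        super().__init__()
        self.text = []       # visible text the spider can index
        self.alt_text = []   # the only clue it gets about images

    def handle_data(self, data):
        if data.strip():
            self.text.append(data.strip())

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt") or ""
            if alt:
                self.alt_text.append(alt)

# Two images: one with alt text, one without.
page = '<h1>Blue Widgets</h1><img src="hero.jpg" alt="blue widget"><img src="banner.jpg">'
viewer = SpiderView()
viewer.feed(page)
print(viewer.text)      # ['Blue Widgets']
print(viewer.alt_text)  # ['blue widget'] -- the second image is invisible to the spider
```

The second image contributes nothing to what the crawler indexes, which is exactly why image-heavy pages without alt text look empty to a spider.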
Too many dead links on the site
Imagine the spider arriving cheerfully at your website and starting to follow links, expecting to find a pile of good content, only to be met by a heap of links that will not open. Once, twice, three times; if it happens every visit, the spider gets angry and stops coming to your pages. Most sites have some dead links; the problem is solved as long as we find and clear them promptly. You can use the powerful Xenu, or check your Web site log and clean up every URL that returns a 404 status code.
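Finding 404s in the log can be scripted in a few lines. This is a hypothetical sketch that assumes the common Apache/Nginx combined log format; the sample log lines are invented.

```python
import re

# Match the request path and the three-digit status code in a combined-format log line.
LINE = re.compile(r'"(?:GET|POST|HEAD) (\S+) [^"]*" (\d{3})')

def dead_links(log_lines):
    """Return the set of request paths that returned HTTP 404."""
    dead = set()
    for line in log_lines:
        m = LINE.search(line)
        if m and m.group(2) == "404":
            dead.add(m.group(1))
    return dead

sample = [
    '1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 512',
    '1.2.3.4 - - [10/Oct/2023:13:55:37 +0000] "GET /old-page.html HTTP/1.1" 404 162',
]
print(dead_links(sample))  # {'/old-page.html'}
```

Run it over the real log file line by line and you get a de-duplicated list of dead URLs to fix or redirect.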
The image above shows the Xenu interface while scanning a Web site.
External links and nofollow
Spiders find and visit your site in the first place because other sites link to it. If you want more frequent spider visits, weigh the linking site when building external links: the more pages it has indexed, the higher its PR value, and the faster its snapshot updates, the higher its weight. In addition, some unscrupulous webmasters deliberately add nofollow to "friendly links", which makes such a link worthless in the spider's eyes. If any of your external links suffer from this, have them removed promptly.
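You can check a link-exchange partner's page yourself instead of taking it on trust. Here is a small sketch that flags anchor tags whose rel attribute contains nofollow; the HTML snippet and partner URLs are made-up examples.

```python
from html.parser import HTMLParser

class NofollowChecker(HTMLParser):
    """Collects hrefs of links that carry rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.nofollowed = []   # links a spider is told not to follow

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            a = dict(attrs)
            if "nofollow" in (a.get("rel") or "").lower():
                self.nofollowed.append(a.get("href"))

html = ('<a href="http://friend-a.com">A</a>'
        '<a rel="nofollow" href="http://friend-b.com">B</a>')
checker = NofollowChecker()
checker.feed(html)
print(checker.nofollowed)  # ['http://friend-b.com']
```

Feed it the fetched HTML of the partner's links page; any of your URLs that show up in the result pass no weight back to you.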
Overly complex site code
In website construction, code optimization is very important. Lengthy code is not only meaningless to the site; it also slows down page loading and hampers the search engine spider's judgment.
An overly complex logical structure
The correct logical structure for a site is a flat tree: from the home page the spider can reach any column page, and from a column page any content page, within about three clicks (larger sites can go a few levels deeper). If the site has too many levels and its weight is not high enough, the spider loses interest in digging down layer by layer.
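The "clicks from the home page" idea can be measured with a breadth-first search over the site's internal link graph, which is essentially what a crawler does. The toy site structure below is invented for illustration.

```python
from collections import deque

# page -> pages it links to (a tiny, made-up flat tree)
links = {
    "/": ["/news/", "/products/"],
    "/news/": ["/news/post-1.html"],
    "/products/": ["/products/widget.html"],
}

def click_depth(start="/"):
    """Return each reachable page's distance (in clicks) from the home page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in depth:
                depth[nxt] = depth[page] + 1
                queue.append(nxt)
    return depth

depths = click_depth()
print(depths)
print(max(depths.values()))  # 2 -- every page is within two clicks of home
```

If the maximum depth on your real site comes out much higher than three, that is a sign the structure is too deep for a spider with limited patience.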
Sitemap errors
A sitemap is a great tool for letting a search engine fully understand a site. To the spider, the sitemap is like a roster: a good one gives it an entry point to every page on the site. So we need to be careful when making it. There are many sitemap-generation tools on the Web; to be sure yours is foolproof, you can also visit it as an ordinary visitor to verify it.
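If you prefer to build the sitemap yourself rather than trust a tool, the XML shape is simple. This is a minimal sketch using only the standard library; the URLs are examples, and real sitemaps often add optional tags such as lastmod.

```python
import xml.etree.ElementTree as ET

# Namespace from the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return sitemap XML as a string for the given absolute URLs."""
    urlset = ET.Element("urlset", xmlns=NS)
    for u in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

doc = build_sitemap(["http://example.com/", "http://example.com/about.html"])
print(doc)
```

Write the result to sitemap.xml at the site root, then open it in a browser yourself, just as the article suggests, to confirm every listed page actually loads.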