As an SEOer, you need to check your site's server logs regularly to see which pages the spider crawled and where it came from. Sometimes, though, you will find the spider crawling pages that do not exist on your site. Today's SEO tutorial looks at why that happens and what to do about it.
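Checking the server log for spider visits can be scripted. Below is a minimal sketch that extracts the pages a spider requested from an access log in the common Apache/Nginx "combined" format; the sample log lines are hypothetical, and the user-agent string is matched by a simple substring check.

```python
import re

# Minimal sketch: pull out requests made by a given spider from access-log
# lines in the "combined" format. The sample lines below are hypothetical.
LOG_LINE = re.compile(
    r'"(?P<method>\w+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def spider_requests(lines, spider="Baiduspider"):
    """Return (path, status) pairs for requests whose User-Agent mentions the spider."""
    hits = []
    for line in lines:
        m = LOG_LINE.search(line)
        if m and spider in m.group("agent"):
            hits.append((m.group("path"), m.group("status")))
    return hits

# Hypothetical sample log lines for illustration.
sample = [
    '1.2.3.4 - - [10/Oct/2023:13:55:36 +0800] "GET /seo/article.html HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"',
    '5.6.7.8 - - [10/Oct/2023:13:56:01 +0800] "GET /index.html HTTP/1.1" 200 1024 "-" '
    '"Mozilla/5.0 (Windows NT 10.0)"',
]

if __name__ == "__main__":
    print(spider_requests(sample))  # [('/seo/article.html', '200')]
```

In practice you would read the lines from your real log file and also verify the spider's IP via reverse DNS, since the user-agent string alone can be faked.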
1. How do spiders find links to our website?
We all know that spiders crawl along links: a spider automatically extracts all the links on a page, stores them, and then crawls them in turn. This is why, when a site has few indexed pages or a newly published article is not yet indexed, people talk about "luring the spider", which in practice means building external links. Once the spider discovers a link, it crawls the page and then, after a series of complex algorithms, decides whether to release (index) it.
2. Why do spiders crawl pages that do not exist?
Generally, there are a few possible causes:
a. Malicious link building by competitors: a rival points a large number of external links at non-existent pages on your site, and the spider crawls them when it discovers them.
b. External links were built earlier, and after a site redesign some of them were not removed in time; the spider follows them on its regular visits.
c. Similar to (b): you are using an old domain whose previous site structure differs from your current one.
d. The robots.txt file does not restrict spiders.
e. A problem in the site's code causes the search-engine spider to fall into a spider trap (a "black hole").
f. The URL submitted or pinged to Baidu is wrong.
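For item (d) above, Python's standard `urllib.robotparser` module can verify whether a given path is actually blocked for a spider. The rules and URLs below are hypothetical, chosen to illustrate blocking an old, removed section of a site.

```python
import urllib.robotparser

# Hypothetical robots.txt rules: block the old, removed section of the site.
rules = [
    "User-agent: *",
    "Disallow: /old-site/",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# Baiduspider falls under "User-agent: *" here, so the old section is blocked
# while current content remains crawlable.
print(rp.can_fetch("Baiduspider", "http://www.example.com/old-site/page.html"))  # False
print(rp.can_fetch("Baiduspider", "http://www.example.com/seo/article.html"))    # True
```

Running this kind of check before deploying robots.txt changes helps confirm you are blocking only the dead sections, not live content.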
When spiders crawl non-existent pages, it is usually for one of these reasons, and the most common is external links, so we should check our site's external links regularly.
Generally speaking, addressing these aspects will largely prevent the problem. If it is caused by external links, use the link-reject tool in Baidu Webmaster Platform to disavow them and submit the site's dead links, while using robots.txt to block spiders from crawling that content. If it is a program problem, fix the program.
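Before submitting dead links, you first need a list of them. One way is to filter the spider's requests in the access log for 404 responses; a minimal sketch, again with hypothetical log lines:

```python
import re

# Sketch: collect paths a spider requested that returned 404 — candidate
# dead links to submit to the webmaster platform. Log lines are hypothetical.
LOG_LINE = re.compile(r'"\w+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) .*"(?P<agent>[^"]*)"$')

def dead_links(lines, spider="Baiduspider"):
    """Return sorted, de-duplicated paths the spider hit that returned 404."""
    return sorted({
        m.group("path")
        for m in (LOG_LINE.search(line) for line in lines)
        if m and spider in m.group("agent") and m.group("status") == "404"
    })

# Hypothetical sample log lines for illustration.
sample = [
    '1.2.3.4 - - [t] "GET /removed/page.html HTTP/1.1" 404 0 "-" "Baiduspider/2.0"',
    '1.2.3.4 - - [t] "GET /seo/article.html HTTP/1.1" 200 512 "-" "Baiduspider/2.0"',
]

if __name__ == "__main__":
    print(dead_links(sample))  # ['/removed/page.html']
```

The resulting list can be saved to a text file and submitted as dead links, and the same paths can inform your robots.txt Disallow rules.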
If you want to learn more about this topic, e Mentor Network can recommend suitable advanced SEO tutorials.