SEOers working on new sites have recently noticed that Baidu only indexes the home page and nothing else. One or two years ago Baidu would begin indexing very quickly, caring about quantity rather than article quality, but now many webmasters find it much harder to get inner pages indexed; after one or two months some sites have only the home page in the index, with the inner pages nowhere to be seen. The author's estimate is that this is the result of Baidu's algorithm adjustments over the past year: the engine first observes the quality of the site, and only then decides whether to release the inner pages and assign rankings.
1. Baidu trust period
Back when Google still operated in China, a new site would go through a sandbox period, generally lasting six months or more, during which the site was given no weight at all. Baidu has now adopted a similar approach: it lets the site build trust first, observes its quality, and only then decides whether to release the inner pages and grant rankings. A new site that cannot hold out through this "trust period" may simply be abandoned by Baidu. The simplest way to check whether Baidu has crawled a page is to look at the return codes in the site log: a 200 means the Baidu spider fetched the page successfully and placed it in the search engine's database, while any other code indicates a problem, as the sketch below illustrates.
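A minimal log-checking sketch, assuming an Apache/Nginx access log in the common or combined format; the file name "access.log" is a placeholder, so adjust the path and pattern to your own server:

```python
import re

# Match the request path and HTTP status code in a common/combined log line.
LOG_LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+)[^"]*" (?P<status>\d{3})')

with open("access.log", encoding="utf-8", errors="ignore") as f:
    for line in f:
        # Only look at requests made by Baidu's spider.
        if "Baiduspider" not in line:
            continue
        m = LOG_LINE.search(line)
        if m:
            # 200 means the spider fetched the page successfully;
            # other codes (301, 404, 503, ...) point to crawl problems.
            print(m.group("status"), m.group("path"))
```

Running this after the spider has visited shows at a glance which pages Baidu actually fetched and with what result.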
2. Content quality problem
The real test of a site is the quality of its content. If the content cannot meet Baidu's requirements, or its quality is too low, Baidu will simply give up on the site. The algorithm changes of the last two or three years show that Baidu cares more and more about user experience; a user-oriented site will succeed as long as it persists. Conversely, for sites that rely entirely on scraping and pseudo-original tools, being dropped from the index is only a matter of time. Content quality means paying attention to each article's quality, readability, and relevance; high-quality original articles are what both search engines and users like.
3. Internal structure
Internal structure matters. If the initial structure makes heavy use of JS, Flash, and iframe/frame layouts, it is unfriendly to spider crawling: such code may make the site look more polished, but the spider cannot read the content inside it. An experienced SEOer should understand that heavy JS is handled poorly by Baidu, Flash is unreadable, and frame structures are better replaced with div-based layouts.
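As a rough self-check, the sketch below counts the tags a spider handles poorly (script, iframe, frame, and Flash-style object/embed) on a given page; the URL is a placeholder assumption, and a high count only flags content worth moving into plain HTML:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class SpiderUnfriendlyTags(HTMLParser):
    """Count tags whose contents a search spider may not be able to read."""
    def __init__(self):
        super().__init__()
        self.counts = {"script": 0, "iframe": 0, "frame": 0, "object": 0, "embed": 0}

    def handle_starttag(self, tag, attrs):
        if tag in self.counts:
            self.counts[tag] += 1

html = urlopen("http://www.example.com/").read().decode("utf-8", errors="ignore")
parser = SpiderUnfriendlyTags()
parser.feed(html)
print(parser.counts)  # many script/iframe/object tags = content the spider may miss
```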
4. Robots.txt protocol
The robots.txt protocol is a convention that all search engines follow; its main purpose is to prohibit spiders from crawling specified paths. It is generally used to block low-value links such as copyright information, contact details, and the company introduction, so that weight is concentrated on the inner pages that matter. However, robots.txt must not be used carelessly: accidentally blocking the home page, or having the file modified after a hacker attack, may cause the spider to stop coming altogether. Checking the robots.txt file regularly is something every webmaster must do.
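A quick way to perform that check is with Python's built-in robots.txt parser; the sketch below assumes the site is www.example.com and verifies that the home page is not accidentally blocked for Baidu's spider while a deliberately blocked low-value page stays blocked:

```python
from urllib.robotparser import RobotFileParser

# Load and parse the live robots.txt file.
rp = RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

# The home page should be crawlable by Baiduspider (expect True).
print(rp.can_fetch("Baiduspider", "http://www.example.com/"))

# A low-value page you chose to block (e.g. the contact page) should not be (expect False).
print(rp.can_fetch("Baiduspider", "http://www.example.com/contact"))
```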
Summary: As long as the site is well built, the structure is not hostile to spiders, the robots.txt file is correct, the server space stays reachable, and the content is completely original and of high quality, then even if only your home page is indexed right now there is no need to worry. Keep at it, and Baidu is sure to release all of your high-quality inner pages.
This article is from Cheike SEO: http://www.xiekaiseo.com/post/63.html. Please credit the source when reprinting.