Today we'll talk about what programmers should pay attention to when building a website:
3. Important pages should be reachable from a relatively shallow position in the site, and every page should be reachable through at least one text link.
Note: I think Baidu means that the simpler the site structure is, the easier it is for Baidu to crawl (don't nest countless divs or tables inside a div or table).
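As a rough illustration of what "shallow" means here, the click depth of each page can be computed with a breadth-first search over the site's text-link graph. The link map below is a made-up example, not a real site:

```python
from collections import deque

# Hypothetical link graph: each page maps to the pages it links to via text links.
links = {
    "/": ["/products", "/about"],
    "/products": ["/products/widget"],
    "/about": [],
    "/products/widget": [],
}

def click_depth(graph, start="/"):
    """Breadth-first search from the homepage; depth = minimum clicks to reach a page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

depths = click_depth(links)
```

A page whose depth comes out large (or missing, meaning no text-link path exists) is exactly the kind of page this tip warns about.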
4. Try to use text instead of Flash, JavaScript, etc. to present important content or links. Baidu is currently unable to identify Flash or JavaScript content, so that content may not be searchable on Baidu, and pages linked only through JavaScript may not be indexed.
Note: This tells us that if your site uses a lot of Flash or JavaScript, it is best to make a page of plain text links pointing to the same addresses that appear inside your Flash or JavaScript.
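A minimal sketch of generating such a fallback page, assuming a hypothetical list of URLs that are otherwise only reachable through Flash or JavaScript navigation:

```python
# Hypothetical destinations that only Flash/JavaScript menus link to on the real site.
flash_links = [
    ("/gallery", "Photo gallery"),
    ("/contact", "Contact us"),
]

def text_link_page(links):
    """Build a plain HTML list of <a> text links that a crawler can follow."""
    items = "\n".join(f'<li><a href="{url}">{label}</a></li>' for url, label in links)
    return f"<ul>\n{items}\n</ul>"

html = text_link_page(flash_links)
```

The point of the sketch is simply that every destination gets an ordinary `<a href>` text link somewhere on the site, which is what the spider can actually follow.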
5. Use frame and iframe structures as little as possible; content displayed through an iframe may be discarded by Baidu.
Note: "As little as possible" — from this sentence I understand that they must not be used in large quantities. You can write a little HTML yourself: add a few iframes, then open the page while watching the process manager, and you will understand why spiders dislike sites with many iframes.
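A quick way to gauge how frame-heavy a page is, using Python's standard HTML parser (the sample page below is made up for illustration):

```python
from html.parser import HTMLParser

class IframeCounter(HTMLParser):
    """Counts frame/iframe tags, i.e. content a crawler may discard."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("frame", "iframe"):
            self.count += 1

# Hypothetical page that delegates its content to two iframes.
page = '<html><body><iframe src="a.html"></iframe><iframe src="b.html"></iframe></body></html>'
counter = IframeCounter()
counter.feed(page)
```

Each counted tag is a block of content the main page does not carry itself, which is exactly what the tip says may be thrown away.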
6. If the website uses dynamic pages, reducing the number of parameters and controlling the parameter length will help indexing.
Note: Reducing the number of parameters and controlling their length helps indexing! Many friends who build sites say to generate static pages for the whole site (they say spiders like static pages but give up on dynamic ones). I really believed it! Today, seeing this sentence in Baidu's help pages, I finally understand why spiders prefer static pages.
Please note this sentence: "Reduce the number of parameters and control the length of parameters"! Anyone who writes programs knows: the more parameters there are and the longer they get, the greater the chance a dynamic page fails. You may not see why that matters, so let's make an analogy: the spider is a person, the web is a district of countless alleys, and a failing dynamic page is a dead end. If someone walks into a dead end here several times, do you think they will come back? Then why do spiders like static pages? Because a static page stays accessible (it is not a dead end) as long as your server doesn't delete it. If you delete static pages from your server in large numbers, before long you will find that Baidu actually prefers neither static nor dynamic: Baidu prefers sites that can be accessed normally and never lead it into a dead end.
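A minimal sketch of "reduce the number of parameters and control their length", assuming a hypothetical whitelist of the one parameter the page actually needs (session IDs, sort orders, and referrer tags are dropped so the URL stays short and stable):

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Hypothetical whitelist: the only query parameter this page truly needs.
REQUIRED_PARAMS = {"id"}

def trim_params(url):
    """Drop query parameters outside the whitelist, keeping the URL short and stable."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k in REQUIRED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

clean = trim_params("http://example.com/item.asp?id=42&sessionid=abc123&sort=price&ref=home")
```

The same content is now served from one short, repeatable URL instead of many long variants, which is the property the tip says spiders reward.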
Reprinting is welcome, but please keep the author credit: www.lt77.com