Site search SEO problems that ASP and other dynamic-language sites should pay attention to

There are many dynamic web languages: ASP, PHP, .NET, JSP, and so on. I emphasize ASP in the title because most enterprise sites on the market today are still built with ASP; the language has a low learning threshold and pairs well with Access databases, so it is the enterprise-site language most programmers prefer. This article does not discuss the security or technical merits of ASP. I only want to share something I learned recently: the impact of an ASP site's search function on SEO.

Static pages get a certain priority in indexing. This is not absolute, but on the same server configuration a static page loads faster than a dynamic one, so from a user-experience standpoint Baidu tends to favor static pages in both inclusion and ranking. Most programmers working at web companies are now starting to pick up SEO and online-marketing concepts, so during site planning many of them generate static pages. But here lies a contradiction: site search. For sites with a lot of products or news this function is crucial, and because it has to pass data at request time, the whole site cannot be made absolutely static. Whether you fall back on pseudo-static URLs or use an XML file as a small database to filter against, the search page is still, in essence, dynamic.
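
To make the contradiction concrete, here is a minimal sketch of what such a dynamic search page typically looks like in classic ASP. The file name search.asp, the Products table, and the Application("ConnectionString") variable are hypothetical assumptions for illustration, not something from the original article.

    <%
    ' A minimal dynamic site-search page (hypothetical search.asp).
    ' Every request rebuilds the result list from the database, which is
    ' why a page like this cannot simply be pre-generated as static HTML.
    Dim keyword, conn, cmd, rs
    keyword = Trim(Request.QueryString("keyword"))

    Set conn = Server.CreateObject("ADODB.Connection")
    conn.Open Application("ConnectionString")  ' assumed to be set in global.asa

    ' Parameterised query (adVarChar = 200, adParamInput = 1)
    Set cmd = Server.CreateObject("ADODB.Command")
    Set cmd.ActiveConnection = conn
    cmd.CommandText = "SELECT Title, Url FROM Products WHERE Title LIKE ?"
    cmd.Parameters.Append cmd.CreateParameter("kw", 200, 1, 255, "%" & keyword & "%")
    Set rs = cmd.Execute

    Do While Not rs.EOF
      Response.Write "<a href=""" & rs("Url") & """>" & _
                     Server.HTMLEncode(rs("Title")) & "</a><br>"
      rs.MoveNext
    Loop

    rs.Close
    conn.Close
    %>

Every distinct query string produces a fresh results page built on the fly, which is exactly the kind of page the rest of this article is concerned with.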

The results pages produced by such a site search tend to be highly repetitive, or at least highly similar to one another. For friends who are unsure what this means, here is an example:

Suppose Taobao has 1,000 computer products. Search for "notebook", then search for "14-inch notebook": the results come out nearly the same. That is only two keywords. As the product data grows, more and more keywords will return similar results, and the results pages for those searches will be extremely similar, even outright duplicates. Naturally, Baidu does not like such pages.
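
Assuming the hypothetical search.asp sketched above, the duplication looks like this: each query string gives the crawler a distinct URL, yet the HTML bodies overlap almost entirely (the query strings below are made up for illustration).

    /search.asp?keyword=notebook           -> lists all 1,000 notebook products
    /search.asp?keyword=14-inch+notebook   -> a large subset of the same products
    /search.asp?keyword=notebook+computer  -> almost the same list yet again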

"Does not like" is only a loose way of putting it. If you really want to understand it, analyze it from the search engine's principles and it becomes clear: the volume of content updated on the Internet every day is enormous, and Baidu is only one company. Sending out its "spider" to crawl and analyze pages all takes time, and because these search pages have to be screened against a database, the time spent analyzing them is several times that of an ordinary page. If, after all that patient analysis, the spider finds that the pages generated by your site search are highly similar to one another, the outcome is easy to imagine: Baidu will not decide your site is bigger because it churns out lots of these pages. On the contrary, the effect is negative, because you have wasted its time, and, crucially, these pages add no rich content to your site.

So, is there any way to deal with this situation? Since the vast majority of websites now offer site search, the disadvantage comes with the territory.

I have read articles and interviews by many of the better-known SEO practitioners in China; they mention this situation too, but so far no one has offered a good way to solve it.

The root cause is as follows:

Using robots.txt to tell the spider not to crawl these pages is feasible. But keep in mind that before it reaches these pages, the spider has been walking through our site's structure step by step; when it finally arrives and is turned away by robots.txt, it is as if we had cut off its path. That is only a metaphor. The real situation is that on-site weight needs to be passed along and circulate back; cut that circulation and the weight that flows in never flows out again, a bit like a "black hole" in space. So whether you use robots.txt or some other means, you can stop the spider from crawling these pages, but you cannot make the weight pass through them properly.
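
For illustration, this is the kind of robots.txt rule being described, again assuming the result pages all go through the hypothetical /search.asp:

    User-agent: *
    Disallow: /search.asp

Compliant spiders will stop requesting those URLs, but, as the paragraph above explains, every internal link that points to them still hands over weight that now has nowhere to flow back.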

To sum up, site search is still a piece of "chicken ribs" for SEO workers: the feature cannot be dropped, yet under the current search-engine algorithms it cannot be handled perfectly either.

However, SEO techniques keep maturing and search-engine algorithms keep becoming more human-friendly. Now that we understand the underlying principle, the fact that there is no solution today does not mean the problem can never be solved.

On the one hand, we can hope the search engines themselves will accommodate this problem; on the other hand, we should keep exploring reasonable SEO solutions of our own.

"Respect for original, share ideas." From the Open Sesame Network Technology original article, reproduced please indicate the origin of the article-http://www.51zmkm.com/news/15.html "
