Some Supplements on Page "Staticization" and SEO Problems

Source: Internet
Author: User
Tags: website performance

http://www.cnblogs.com/JeffreyZhao/archive/2009/07/06/more-on-page-statilization-and-seo.html

The previous article discussing "static pages" received a good response: many friends expressed their opinions and gave Lao Zhao more ideas. Although I already replied to a lot of comments under that article, past experience tells me that summing things up in a new article makes what I want to express clearer, which matters especially for a concept like "static" that is so easy to misunderstand.

Let's first discuss what a "static page" is. Some friends say that an .htm or .html file stored on the hard disk is a static page: the web server needs no additional processing and can directly read and output the file's content, and such static files help SEO. The stated reason is that search engines give better weight to URLs ending in .html (which sounds more like the conclusion than a reason), and that this is "common sense", something "anyone who knows a little SEO knows", and "common practice", so "it must be correct". In fact, Google does not think so, and Baidu has never given an official explanation.

Of course, we have stressed this before, but it still needs constant clarification: even if a search engine does have a preference for "static pages", that preference is about the "URL style", not about "placing an HTML file on the hard disk". The requester (that is, the crawler) only sends a URL to the server and receives content back; it neither cares about nor knows how the server produced that content. For the client, there is no such thing in the world as a static or dynamic page. Some may still say: "No, .html is a static page and .aspx is a dynamic page. The former needs no computation on the web server; the latter does."

Is that true? No, because serving an HTML file also requires computation on the web server. For example, when you request an HTML file, the web server does at least the following:

    • If the request contains cache information, handle the cache state.
    • Locate the file on disk according to the URL.
    • Perform user authentication and authorization (for example, is the user anonymous?).
    • Determine whether read permission is available.
    • Read the file.
    • Set the MIME type based on the file type.
    • Set the Last-Modified value based on the file's last modification date.
    • Set the ETag value based on the file content and other state.
    • If the file contains an include directive, read the referenced file and fill it in.

See how many "dynamic operations" are needed to serve one file. All of these happen when a web server (such as IIS) loads an HTML file. If you want to observe these processes, read the source code, or look at what the System.Web.StaticFileHandler class in ASP.NET does; it reflects the key points of how a web server handles HTML. In fact, if you map .html to the ASP.NET ISAPI in IIS, or use the web server that ships with Visual Studio, it is StaticFileHandler that outputs the file from the hard disk.
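The steps listed above can be sketched as a minimal static-file handler. This is a simplified illustration in Python, not IIS's or StaticFileHandler's actual implementation; the function name and return shape are made up for the sketch:

```python
import email.utils
import hashlib
import mimetypes
import os

def serve_static(path, request_headers):
    """Sketch of the 'dynamic operations' behind serving a 'static' file.

    Returns (status_code, headers, body).
    """
    # Locate the file and check read permission (authentication omitted here).
    if not os.path.isfile(path) or not os.access(path, os.R_OK):
        return 404, {}, b""

    # Read the file and compute the response headers from its state.
    with open(path, "rb") as f:
        body = f.read()
    stat = os.stat(path)
    headers = {
        # MIME type guessed from the file extension.
        "Content-Type": mimetypes.guess_type(path)[0] or "application/octet-stream",
        # Last-Modified from the file's modification time.
        "Last-Modified": email.utils.formatdate(stat.st_mtime, usegmt=True),
        # ETag derived from the file content.
        "ETag": '"%s"' % hashlib.md5(body).hexdigest(),
    }

    # Handle the cache state: if the client's copy is current, answer 304.
    if request_headers.get("If-None-Match") == headers["ETag"]:
        return 304, headers, b""
    return 200, headers, body
```

Even this toy version shows that a "static" request involves permission checks, metadata computation, and conditional-request logic, none of which the client ever sees.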

So although it seems that the web server simply reads files from the hard disk, it is actually not that simple. But to the client, all of this is invisible. For example, when Squid or Nginx is deployed as a front-end cache or reverse proxy, it does not care whether the backend web server runs on Windows, Linux, or UNIX, whether it is IIS, Apache, or Lighttpd, or even an efficient or inferior web server we wrote ourselves. Browsers, crawlers, and front-end load balancers know only TCP/IP, HTTP, and the like.

However, some people insist that "generating static pages" as a form of "page caching" helps SEO, the reason being that page caching improves website performance and crawlers prefer faster pages. From this point of view the argument is true. I still don't like it, though, because it misses the key point: what matters for SEO here is optimizing website performance, and generating static pages is only one way to do that, neither the most widely applicable nor the easiest to implement. Directly equating "generating static pages" with "SEO" is likely to mislead others.
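To make the point concrete: writing HTML files to disk is just one way to cache rendered output. An in-memory page cache achieves the same speedup without touching the file system at all, and the crawler cannot tell the difference. A minimal sketch (the names and TTL scheme are hypothetical, assuming a render function that depends only on the URL):

```python
import time

_cache = {}  # url -> (expires_at, rendered_html)

def cached_page(url, render, ttl=60.0):
    """Serve a rendered page from an in-memory cache, re-rendering after `ttl` seconds.

    To a browser or crawler, the response is indistinguishable from a file
    read off the hard disk; it just arrives faster on a cache hit.
    """
    now = time.time()
    hit = _cache.get(url)
    if hit is not None and hit[0] > now:
        return hit[1]              # cache hit: no rendering work at all
    html = render(url)             # cache miss: do the "dynamic" work once
    _cache[url] = (now + ttl, html)
    return html
```

Reverse proxies like Squid or Nginx do essentially this at the HTTP layer, which is why "generate static files on disk" is only one of several equivalent caching strategies.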

Of course, if your thinking is sound and the referent of the phrase "static page" is clear enough, the proposition "static pages are good for SEO" is undoubtedly correct. But we are not merely debating whether a proposition's logic holds, nor do we need to obsess over the rigor of one particular expression; our purpose is to explain the underlying truth. That is why Lao Zhao has written so much, over and over: the key to these articles is to "clarify the truth", and that is what we should grasp.

Finally, let's talk about Lao Zhao's thoughts on SEO.

In Lao Zhao's experience with various SEO personnel, they always have all kinds of reasons to explain "the problem", and if things remain ineffective after their improvements, they can likewise find all kinds of reasons to tell you why. But note that SEO is a practical discipline: its only criterion is "effect", not "theory". SEO theory is easy to grasp, but if it cannot actually improve a website's standing in search engines, it is still useless. Lao Zhao believes a good SEO practitioner needs at least common sense about building websites, or a basic grasp of website development technology; otherwise it is all armchair theorizing. Lao Zhao once dealt with a "professional" SEO company whose "SEO consultant" left a deep impression on me, a negative one. His "non-professionalism" was evident in the following incidents:

    1. Again the "static page" problem. After we changed our URLs to end with .html, there was no obvious effect. He asked about our implementation, and when he learned that we used URL rewriting rather than placing HTML files on the hard disk, he declared that this "spoofing of the search engine" would have a negative effect, and strongly urged us to actually place HTML files on disk. We naturally rejected this. One reason is that ours is a highly dynamic website and the requirement is hard to meet; but more importantly, anyone who knows a little about the technology knows that the server's processing method is completely invisible to search engine crawlers. Whether HTML files physically exist on disk has nothing to do with search engines.
    2. The placement of content. There is a saying in the SEO field that search engines give more weight to content near the front of a page and less to content near the back. So the professional SEO consultant pointed at a page and said this part of the content was too "low" and would easily be ignored by search engines, because "the content appears near the bottom of the page". Does this argument make sense? Modern page layout usually uses XHTML + CSS; the search engine only looks at the HTML content, while the visual "position" is largely determined by CSS and can even be controlled by JavaScript. Content that appears early in the HTML can perfectly well be displayed at the bottom of the page, and this has nothing to do with the search engine. Unfortunately, even this took a long time to explain.
    3. The last one is the most ridiculous. Because the SEO results were poor, the consultant decided he had to "get hands-on", so he asked us for the website's IIS logs. Log analysis is indeed helpful for SEO, because it shows the crawlers' crawling order, frequency, and even results, so there is nothing wrong with wanting to see the logs. Unfortunately, he then gave us an MSN email address and asked us to send him the logs from the past few weeks. Seeing this request, Lao Zhao almost swore out loud. It shows that the consultant lacks even basic common sense: he doesn't know that even a small or medium-sized website generates hundreds of megabytes to several gigabytes of logs every day. How does someone without such common sense come to have so many "success stories"?
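The URL rewriting mentioned in point 1 is worth illustrating, because it shows why the ".html" suffix is purely cosmetic: the server maps the URL to a dynamic handler, and no file with that name ever exists on disk. A rough sketch of the idea (the route pattern and function names are hypothetical, not our actual rewrite rules):

```python
import re

def handle_request(url):
    """Sketch of URL rewriting: a '.html' URL served entirely by dynamic code.

    No 'articles/123.html' file exists on disk; the extension is part of the
    URL style only, and a crawler has no way to tell the difference.
    """
    m = re.fullmatch(r"/articles/(\d+)\.html", url)
    if m:
        article_id = int(m.group(1))
        # A real site would fetch this from a database; we fake it here.
        return "<html><body>Article %d</body></html>" % article_id
    return None  # fall through to other handlers / 404
```

From the crawler's side, the response to `/articles/123.html` is just HTTP headers and an HTML body, exactly as if a file had been read from disk.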

Lao Zhao's blog (that is, the one you are reading) also performs very badly in search engines. Even for topics Lao Zhao writes about often, it is hard to find my articles on Google, and the rankings are not high. Without restricting the search with site:cnblogs.com, a search almost never lands on my blog, only on copies reprinted elsewhere. I do worry about this; I have consulted some SEO experts and, after making some changes, still haven't seen much improvement. However, I believe I simply haven't met a good SEO practitioner yet, and my blog's potential is far from tapped.

If you are a professional SEO engineer or a professional SEO company, feel free to give me some suggestions; if it works, I don't mind investing in this area. But please, no "dirty" optimization methods such as posting links on forums or sending spam. I know these practices can be very effective, but I don't want to do them.
