Let's start by talking about what is called a "static page." A friend of mine put it this way: an .htm or .html file sitting on the hard disk is a static page; the web server needs no extra processing, it simply reads the file and sends the content out, and such static files are good for SEO. As for why, the claim is that search engines give more weight to URLs ending in .html (which sounds like a conclusion, not a reason), and besides, this is "common sense," something "anyone who knows a little SEO knows," "what everyone does," so "it must be correct." In fact, Google does not see it that way, and Baidu has never made an official statement on it.
Of course, as we have stressed repeatedly (and it bears repeating): even if search engines do show some preference for "static pages," it is because of the URL style, not because an HTML file is sitting on a hard disk somewhere. The requester (that is, the crawler) simply sends a URL to the server and receives content back. It neither cares about nor can it know how the server produced that content; from the client's point of view, there is no such thing as a "static" or "dynamic" page. Some friends may still object: "No, an HTML file is a static page and an ASPX page is dynamic; the former needs no work from the web server, the latter does."
Is that true? No, because serving an HTML file also requires computation on the web server. When you request an HTML file, the web server does at least the following:
If the request carries cache-related headers, handle the cache state (e.g., reply 304 Not Modified).
Map the URL to a file on disk.
Perform authentication and authorization (is anonymous access allowed?).
Check whether the file may be read.
Read the file.
Set the MIME type based on the file extension.
Set the Last-Modified header from the file's last modification time.
Set the ETag header based on the file's content and other state.
If the file contains server-side include directives, read other files to fill them in.
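The steps above can be sketched in a few lines. This is a minimal illustration, not any real server's code: the function name `serve_static` and its return shape (status, headers, body) are invented for this example, and real servers add path-traversal checks, range requests, and much more.

```python
import email.utils
import hashlib
import mimetypes
import os

def serve_static(path, if_none_match=None):
    """Sketch of the work a web server does for a 'static' file.
    Hypothetical helper; returns (status, headers, body)."""
    # Map the URL to a file on disk (traversal checks omitted here).
    if not os.path.isfile(path):
        return 404, {}, b""
    # Check readability, then read the file.
    if not os.access(path, os.R_OK):
        return 403, {}, b""
    with open(path, "rb") as f:
        body = f.read()
    # Derive the MIME type from the file extension.
    mime, _ = mimetypes.guess_type(path)
    # Last-Modified from the file's modification time.
    last_modified = email.utils.formatdate(os.path.getmtime(path), usegmt=True)
    # A simple ETag derived from the file content.
    etag = '"%s"' % hashlib.md5(body).hexdigest()
    headers = {
        "Content-Type": mime or "application/octet-stream",
        "Last-Modified": last_modified,
        "ETag": etag,
    }
    # If the request carried a cache validator, answer 304 Not Modified.
    if if_none_match == etag:
        return 304, headers, b""
    return 200, headers, body
```

Even this toy version has to compute headers, hash content, and negotiate caching; "just read the file" was never the whole story.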
See how much "dynamic" work it takes to serve a single file; a web server such as IIS does all of this just to deliver an HTML page. If you want to observe the process yourself, read the source code of some web server, or look at what the System.Web.StaticFileHandler class does in ASP.NET, which embodies the essentials of how a web server handles HTML. In fact, if you map .html to the ASP.NET ISAPI in IIS, or use the web server built into Visual Studio, then the files on your hard disk are being served by StaticFileHandler.
So while it looks as if the web server simply reads a file off the hard drive, it is not as simple as we think. And to the client, all of this is unknowable. For example, a cache or reverse proxy such as Squid or Nginx deployed in front of the site does not care whether the backend web server runs on Windows, Linux, or UNIX, nor whether it is IIS, Apache, lighttpd, or even some efficient (or terrible) web server we wrote ourselves. Browsers, crawlers, and front-end load balancers know only TCP/IP, HTTP, and the like, and nothing more.
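To make the point concrete, here is a small sketch showing that the raw HTTP bytes a client receives can be identical whether the body came from a file on disk or was computed on the fly. The helper names (`render_response`, `from_disk`, `generated`) are invented for this illustration.

```python
def render_response(body: bytes, content_type: str = "text/html") -> bytes:
    """Build the raw HTTP response bytes a client actually sees."""
    head = (
        b"HTTP/1.1 200 OK\r\n"
        b"Content-Type: " + content_type.encode() + b"\r\n"
        b"Content-Length: " + str(len(body)).encode() + b"\r\n"
        b"\r\n"
    )
    return head + body

def from_disk(path: str) -> bytes:
    """'Static': the body is read from a file on the hard disk."""
    with open(path, "rb") as f:
        return render_response(f.read())

def generated() -> bytes:
    """'Dynamic': the body is computed and never stored on disk."""
    return render_response(b"<html>hello</html>")
```

If the two response byte streams are identical, no crawler, browser, or proxy on the other end of the TCP connection can tell which code path produced them.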
However, some friends insist that "generating static pages" as a form of page caching helps SEO. The reasoning is that page caching improves the site's performance, and crawlers are more willing to visit faster pages. Viewed that way, the claim does contain some truth. But I still dislike it, because it fails to grasp the key point. The key to SEO here is optimizing the site's performance; generating static pages is merely one means to that end, and it is neither the most flexible nor the easiest to implement. Connecting "generate static pages" directly to "SEO" is likely to mislead people.
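Generating static files is only one way to cache pages; a simpler means to the same performance end is an in-memory page cache, sketched below. This is a minimal illustration with invented names (`cached_page`, a caller-supplied `render` function) and no eviction policy, not production code.

```python
import time

# url -> (expiry timestamp, rendered body)
_page_cache = {}

def cached_page(url, render, ttl=60):
    """Serve a rendered page from an in-memory cache.

    `render` is whatever function builds the page (e.g. from a
    database); it is only called when the cache misses or expires.
    """
    now = time.time()
    hit = _page_cache.get(url)
    if hit is not None and hit[0] > now:
        return hit[1]  # cache hit: no rendering work at all
    body = render(url)
    _page_cache[url] = (now + ttl, body)
    return body
```

The crawler gets the same fast response either way; whether the cache lives in memory, in a reverse proxy, or as files on disk is an implementation detail it can never observe.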
Of course, if your underlying thinking is sound and the term "static page" is used with a clear enough meaning, the proposition "static pages help SEO" is undoubtedly correct. But our purpose here is not to debate whether a proposition is logically valid, nor to quibble over how rigorously something is phrased; the purpose is to clarify the reasoning behind it. That is exactly why Lao Zhao has written so much on this topic, over and over. In other words, the point of these articles is to clarify how things actually work; let's keep hold of that.
Finally, let Lao Zhao share his view of SEO as a line of work.
From Lao Zhao's experience dealing with SEO practitioners, they always have all sorts of reasons to explain a "problem," and if fixing that problem produces no effect, they can find all sorts of reasons to tell you why it didn't. But SEO is practical work: the only basis for judging it is results, not theory. SEO theory is easy to master, but if you cannot actually improve a site's standing in search engines, it is all for nothing. Lao Zhao believes that a good SEO practitioner must understand how web pages are made and the basics of web development, or at least have common sense about them; otherwise they are next to useless. Lao Zhao once dealt with a "professional" SEO company whose "SEO consultant" left me a deep impression, a deeply negative one. How "unprofessional" he was is evident from the following episodes:
First, the "static page" issue again. He asked about our implementation, since making our URLs end in .html had produced no noticeable effect. When he learned that we used URL rewriting rather than placing HTML files on the hard disk, he exclaimed that "this kind of search-engine deception can backfire," and strongly insisted we put actual HTML files on the disk. We naturally refused. One reason is that ours is a highly dynamic site, so the demand would be hard to satisfy; but more importantly, anyone with a little technical knowledge knows that how the web server processes a request is completely invisible to a search-engine crawler. Whether we actually place HTML files on disk has nothing to do with search engines.
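URL rewriting is exactly this kind of invisible server-side detail. A minimal sketch, with an invented URL pattern and a stubbed-out handler: the URL looks like a file on disk, but no such file exists, and the crawler cannot tell the difference.

```python
import re

def article_page(article_id: int) -> str:
    """Build the page dynamically. A real site would query a
    database here; this stub just formats a string."""
    return "<html>article %d</html>" % article_id

def rewrite(url: str):
    """Map a 'static-looking' URL to a dynamic handler.

    /articles/42.html is not a file anywhere on disk; the .html
    suffix exists purely for the URL style. Returns None when the
    URL does not match (a real server would try other routes).
    """
    m = re.match(r"^/articles/(\d+)\.html$", url)
    if m:
        return article_page(int(m.group(1)))
    return None
```

From the crawler's side there is only a URL ending in .html and an HTML response; calling that "deceptive" misunderstands what a crawler can see.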
Second, the content-position issue. There is a saying in the SEO industry that search engines weight content near the top of a page more heavily, and content further down less. So this professional SEO consultant pointed at one of our pages and said that a certain block of content was too far "down" and would easily be overlooked by search engines. Note that he meant "appears low on the page as displayed." Does that make sense? Page layouts nowadays commonly use XHTML+CSS; search engines look only at the HTML content, while visual "position" is largely controlled by CSS, or even by JavaScript. Content that appears early in the HTML source can be rendered at the bottom of the page, which has nothing to do with the search engine. Unfortunately, it took half a day to explain this to him.
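A small demonstration of the gap between source order and display order. The page below is entirely made up for this example: its CSS positions the sidebar at the top of the rendered page, yet anything reading the HTML the way a crawler does (here, Python's standard `html.parser`) sees the article text first, in source order.

```python
from html.parser import HTMLParser

# Hypothetical page: CSS pins #sidebar to the top of the viewport,
# but in the HTML source the main article comes first.
PAGE = """<html><head><style>
  #sidebar { position: absolute; top: 0; }  /* rendered at the top */
</style></head><body>
<div id="main">The important article text.</div>
<div id="sidebar">Ads and links.</div>
</body></html>"""

class TextOrder(HTMLParser):
    """Collect visible text in source order, as a crawler reads it."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = False
    def handle_starttag(self, tag, attrs):
        if tag in ("style", "script"):
            self._skip = True
    def handle_endtag(self, tag):
        if tag in ("style", "script"):
            self._skip = False
    def handle_data(self, data):
        if self._skip:
            return
        text = data.strip()
        if text:
            self.chunks.append(text)

parser = TextOrder()
parser.feed(PAGE)
```

The parser yields the article text before the sidebar, regardless of where CSS draws either block on screen.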