Web directory structure: SHTML and HTML optimization analysis


The CMS I use has some functional shortcomings, but it is good enough for most webmasters. The only regret is that its dynamic pages are not ideal for SEO. A while ago I read an article on optimizing this CMS which suggested switching the static page generation from plain HTML to SHTML, a mode that is more conducive to search engine optimization.

HTML (HyperText Markup Language) is the most widely used language on the Internet and the main language Web documents are written in. HTML was designed to make it easy to link text or graphics stored on one computer to text or graphics on another, forming an organic whole without the reader having to care whether a given piece of information lives on the current machine or elsewhere on the network. We simply click a link in a document, and the browser immediately fetches the associated content, which may be stored on another computer on the network. An HTML document is descriptive text made up of HTML markup, which can describe text, graphics, animation, sound, tables, links, and so on. Structurally, an HTML document consists of two parts: a head, which carries information the browser needs, and a body, which contains the actual content to be displayed.
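As a minimal illustration of the head/body structure described above (the file name and link target are hypothetical):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- The head carries information for the browser, not visible content -->
  <meta charset="utf-8">
  <title>Example page</title>
</head>
<body>
  <!-- The body holds the content actually displayed to the reader -->
  <p>See <a href="http://example.com/other.html">a document on another computer</a>.</p>
</body>
</html>
```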

SHTML works much like ASP: a file with the .shtml extension can contain SSI (Server Side Includes) directives, just as an ASP file contains ASP instructions. When a client requests an .shtml file, the server reads the file, interprets the SSI directives it contains, and sends the assembled result to the browser.
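A minimal sketch of such a page (the fragment file name is hypothetical): the server replaces the directive with the contents of the named file before the page ever reaches the client, so the spider sees ordinary HTML.

```html
<!-- column.shtml : the server substitutes latest.html in place of the
     #include directive below, then sends the merged page to the client -->
<html>
<body>
  <h1>Column page</h1>
  <!--#include file="latest.html" -->
</body>
</html>
```

Note that SSI must be enabled on the server; in Apache this means something like `Options +Includes` together with `AddOutputFilter INCLUDES .shtml` in the configuration.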

Static HTML pages commonly invoke related articles, hot articles, or recommended articles via JavaScript calls, and we know very well that search engine spiders do not execute JavaScript, so such calls are unfriendly to search engines. At the time I used the IT workshop e-books second-level sub-site to run an SEO experiment, and after a few months I discovered and accumulated a lot of experience from it!

1. Home page: Search engines access http://book.ithov.com directly and then crawl from the home page links into the second-level directories and content pages. When the home page relies on JS calls, spiders cannot recognize that content, so I changed the home page to the form http://book.ithov.com/index.shtml and replaced the original JS calls with <!--#include file="xxx.html" -->. Search engines can then identify the included content and crawl deeper into the content pages.
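The replacement described above looks roughly like this (file names are hypothetical illustrations):

```html
<!-- Before: the article list is pulled in by JavaScript,
     so spiders see nothing here -->
<script src="/js/recommend.js"></script>

<!-- After: the server merges the list into the page before delivery,
     so spiders see plain HTML links they can follow -->
<!--#include file="recommend.html" -->
```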

2. Column pages: Search engines also crawl into the column pages. For these I used URLs of the form http://book.ithov.com/List/List_341.shtml, again using <!--#include file="xxx.html" --> to invoke the latest or recommended articles. Once the included HTML file is generated, refreshing that one file updates every column page that includes it, with no need to regenerate all the column pages. For users whose sites have many columns, this is without doubt a good way to save a great deal of static-page regeneration time.
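The refresh step described above can be sketched as follows (a hypothetical illustration, not the CMS's actual code; function and file names are invented): only the small shared fragment is rewritten, and every .shtml page that includes it is up to date on the next request.

```python
# Regenerate the shared fragment that all column pages pull in via
# <!--#include file="latest.html" -->. Only this one file is rewritten;
# the .shtml column pages themselves never need to be regenerated.

def render_latest_fragment(articles):
    """Build the HTML fragment listing the latest articles.

    `articles` is a list of (title, url) tuples.
    """
    items = "\n".join(
        f'<li><a href="{url}">{title}</a></li>' for title, url in articles
    )
    return f"<ul>\n{items}\n</ul>"

def refresh_fragment(path, articles):
    """Overwrite the include file; every page including it now shows
    the latest list without any static-page regeneration."""
    with open(path, "w", encoding="utf-8") as f:
        f.write(render_latest_fragment(articles))

if __name__ == "__main__":
    latest = [("New e-book", "/2008/200807/ebook_29773.shtml")]
    refresh_fragment("latest.html", latest)
```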

3. Content pages: Although each individual content page carries very little weight, interlinking and interoperability between pages matters a great deal to search engines, and JS-based calls have many drawbacks here: search engines cannot identify them, and no link information passes between the site's content pages. So I generate content pages as SHTML, in the form http://book.ithov.com/2008/200807/ebook_29773.shtml, with <!--#include file="xxx.html" --> directives inside. Each time the included file is refreshed, all content pages pick up the latest content, without every static content page having to be regenerated.

I believe that with the improvements above, your site's search engine and internal link optimization will become more reasonable, and of course spiders will keep crawling back and forth through your site every day. These are some of my experiences, shared here; corrections and comments are welcome!

Author: Sword Tree. Source: IT square Information Network, http://www.ithov.com. Copyright: media are welcome to reprint; please credit the author and the source!

In the next article I will focus on analyzing SHTML-to-HTML conversion, along with observations on search engine behavior and the pros and cons of each approach!
