Today's SEO has long since moved beyond simply posting external links and tweaking site structure to chase rankings. SEO is the pursuit of targeted traffic, and the ultimate goal is to turn that targeted traffic into orders. But users' patience is limited: if a page is slow to open, visitors may well close it before it has even finished downloading. That is why page compression matters so much in today's online marketing. Today, Shanghai SEO Gu Huiming will chat with you about how to compress a web page to improve its loading speed.
Let's first look at an example: how the NetEase 163 homepage appears in Baidu's snapshot.
First, here is today's snapshot of the 163 homepage.
And here is the original page content.
Did you notice? The snapshot of NetEase's 163 homepage contains far less content than the actual page. Did the spider crawl it wrong? Actually no. Let's look at the source code of the 163 homepage.
The part marked in red is CSS, all of which could be compressed away by moving it into an external file and calling it from there. Instead, NetEase placed all of this code directly in the homepage, so the page file is too large for the spider to crawl completely. Spiders generally crawl only around the first 200KB of a page; that figure may not be exact, but one thing is certain: compress your pages as much as you can.
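As a rough sketch of what moving the styles out looks like (the file name /css/home.css is only a placeholder):

    <!-- Before: CSS written directly into the page, inflating the HTML file -->
    <style type="text/css">
        .nav { width: 960px; margin: 0 auto; }
        /* ...hundreds more lines of rules... */
    </style>

    <!-- After: the same rules live in an external file -->
    <link rel="stylesheet" type="text/css" href="/css/home.css" />

The external file can be cached by the browser, and the HTML the spider actually fetches stays small.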
From an SEO perspective, page compression has no direct impact on rankings. But when a spider visits a site to crawl its pages, a long response time is bound to hurt the crawl: the time a spider spends on any one site is limited, so if pages are too large and slow to respond, the number of pages indexed is likely to suffer.
So how do you compress a web page? I generally use the following two methods.
1. Enable gzip compression on the server side.
Before a page leaves the server it is compressed with gzip, and the compressed version is transmitted to the client; the client's browser then decompresses it and displays it. This costs a little CPU on both the server and the client, but in exchange you get better bandwidth utilization and faster page loading. For plain text the compression ratio is considerable, generally more than 60%.
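How this is switched on depends on your server software. As a rough sketch, on Apache with the mod_deflate module available you can add something like the following to the site configuration or .htaccess (the exact MIME types to list depend on your site):

    <IfModule mod_deflate.c>
        # Compress text-based responses before they are sent to the browser
        AddOutputFilterByType DEFLATE text/html text/css text/javascript application/javascript application/xml
    </IfModule>

You can check that it is working by looking at the response headers in the browser: a compressed response carries Content-Encoding: gzip.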
2. Optimize the page's HTML, JS, and CSS code. This kind of compression requires some basic knowledge of code; in my view it is still worthwhile for SEO practitioners to know some front-end code. In general I do the following: use the short, backward-compatible document type declaration <!DOCTYPE html> in place of a traditional DTD, which can save more than 180 characters per page; use a DIV-based structure, which takes less code than a table-based layout; and for pages pasted in from Word or WordPad, go into the source code and strip out the redundant markup. Two small illustrations follow below.
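Roughly, and only as an illustration (the old declaration shown is the common XHTML 1.0 Transitional one; the exact saving depends on which DTD your pages currently use):

    <!-- A traditional DTD declaration -->
    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
        "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">

    <!-- The short, backward-compatible declaration -->
    <!DOCTYPE html>

    <!-- A table layout for a simple two-column row -->
    <table><tr><td>News</td><td>Sports</td></tr></table>

    <!-- The same content as plain DIVs, with the layout handled in the external CSS -->
    <div>News</div>
    <div>Sports</div>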
While we are on the subject of code, I should add that Google pays close attention to code quality. If the code is messy, with no sense of structure or hierarchy, the site may well be demoted as a penalty.