Use compression to make the web faster


On the Internet, more than 99 person-years are wasted every day downloading uncompressed content. Although support for compression is a standard feature of every modern browser, users often still end up not receiving compressed data, for a variety of reasons. As a result, bandwidth is wasted and the interaction between users and web pages slows down.

Uncompressed data hurts all users. For narrowband users, the extra bytes simply take longer to download. For broadband users, even though the raw transfer rate is high, uncompressed data requires more network round trips (more IP packets) before the connection reaches its maximum transfer speed. (emu's note: for example, when a broadband user visits a multimedia-heavy website, the page HTML, CSS, and scripts must all be downloaded before the multimedia content can be fetched at full speed.) In that situation, the number of round trips (the number of IP packets) is a major factor in how quickly the complete page loads; even under good network conditions, each extra round trip costs tens or even hundreds of milliseconds.
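To make the bandwidth-and-round-trips argument concrete, here is a minimal sketch (not from the original article) that gzip-compresses a block of repetitive, markup-like text and reports the size reduction, along with a rough count of how many ~1460-byte TCP segments each version needs. Both the sample text and the segment size are illustrative assumptions; real pages vary, but text content such as HTML, CSS, and JavaScript commonly shrinks by well over half.

```python
import gzip

# Illustrative only: repetitive markup standing in for real page text.
sample = ('<div class="article"><p>Lorem ipsum dolor sit amet, '
          'consectetur adipiscing elit.</p></div>\n') * 500

raw = sample.encode("utf-8")
compressed = gzip.compress(raw)

MSS = 1460  # assumed TCP payload bytes per packet; actual values vary


def segments(n_bytes):
    """Number of MSS-sized segments needed to carry n_bytes (ceiling division)."""
    return -(-n_bytes // MSS)


print(f"uncompressed: {len(raw):>7} bytes  ({segments(len(raw))} segments)")
print(f"gzip:         {len(compressed):>7} bytes  ({segments(len(compressed))} segments)")
print(f"savings:      {100 * (1 - len(compressed) / len(raw)):.1f}%")
```

Fewer bytes means fewer segments to deliver while the connection is still ramping up, which is where the extra round trips described above come from.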

In Steve Souders's Even Faster Web Sites, Tony Gentilcore presents data showing how page load time increases when compression is disabled. With the author's permission, we reproduce here the results for three of the top Alexa websites:

 

| Website | Alexa ranking | Increase in download size (first load) | Increase in page load time (1000/384 Kbps DSL, broadband) | Increase in page load time (56 Kbps modem, narrowband) |
|---|---|---|---|---|
| http://www.google.com/ | 1 | 10.3 KB (44%) | 0.12 seconds (12%) | 1.3 seconds (25%) |
| http://www.facebook.com/ | 2 | 348 KB (175%) | 9.4 seconds (414%) | 63 seconds (524%) |
| http://www.yahoo.com/ | 3 | 331 KB (126%) | 1.2 seconds (64%) | 9.4 seconds (137%) |

The data comes from the chapter "Going Beyond Gzipping" in Steve Souders's Even Faster Web Sites, reproduced with the author's permission.

 

Google's web search logs also show that users who receive uncompressed data spend 25% more time loading pages than users who receive compressed data. In a randomized experiment, we forced compressed data onto a sample of users whose requests claimed not to accept it, and measured a page-latency improvement of roughly 300 milliseconds. Even this understates the true cost of missing compression, because users whose requests do not declare compression support are likely to be running older software on older computers.

Why do users not receive compressed content?

We have found four common reasons why users do not receive compressed content: anti-virus software, browser bugs, web proxies, and misconfigured web servers. The first three alter the web request in transit, so the web server never learns that the browser can actually decompress the content; specifically, they remove or mangle the Accept-Encoding header that the browser should send to the server with every request.
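To illustrate why a missing or mangled header matters, here is a minimal sketch (not code from the article; the helper name is hypothetical) of the kind of content negotiation a web server performs: it compresses the response only if the request's Accept-Encoding header explicitly lists gzip, so anything that strips that header silently forces uncompressed responses.

```python
import gzip


def negotiate_body(request_headers, body):
    """Return (response_headers, payload), compressing only when the client
    declared gzip support. Hypothetical illustration of server behavior."""
    accept = request_headers.get("Accept-Encoding", "")
    encodings = {token.split(";")[0].strip().lower()
                 for token in accept.split(",") if token.strip()}

    if "gzip" in encodings:
        payload = gzip.compress(body)
        return {"Content-Encoding": "gzip", "Vary": "Accept-Encoding"}, payload

    # Header missing, stripped, or mangled: the server has no choice but to
    # send the full, uncompressed body.
    return {"Vary": "Accept-Encoding"}, body


page = b"<html>" + b"x" * 10_000 + b"</html>"
# A browser that sends the header gets the small payload...
_, small = negotiate_body({"Accept-Encoding": "gzip, deflate"}, page)
# ...the same browser behind a header-stripping proxy gets the full body.
_, big = negotiate_body({}, page)
print(len(small), "bytes compressed vs", len(big), "bytes uncompressed")
```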

Anti-virus software may intercept and alter web requests to reduce its own CPU usage, causing the server to send uncompressed data to the client (that way the software can scan for viruses directly, without having to decompress first). But when the CPU is not the system's bottleneck, this does users no favor. Some well-known anti-virus products are known to interfere with web compression; users can check whether theirs does on the browser compression support test page at Browserscope.org.

By default, IE6 downgrades to HTTP/1.0 when it accesses the web through a proxy server, and as a result it does not send the Accept-Encoding request header. The table below, generated from Google's web search logs, shows that IE6 accounts for 36% of all searches that do not declare support for compressed results, a share far higher than IE6's actual share of searches.

 

 

| Browser | Searches from this browser that do not declare compression support | Share of all searches that do not declare compression support |
|---|---|---|
| Google Chrome | 1% | 1% |
| Safari | 1% | 1% |
| Firefox 3.5 | 3% | 4% |
| Internet Explorer 8 | 6% | 5% |
| Firefox 3.0 | 6% | 7% |
| Other | 46% | 22% |
| Internet Explorer 7 | 7% | 24% |
| Internet Explorer 6 | 20% | 36% |

Data from Google search logs

There are also a small number of ISPs for which the proportion of requests not declaring support for compression exceeds 95%. It seems reasonable to assume that these ISPs, or corporate proxies, remove or mangle the Accept-Encoding HTTP header. As with anti-virus software, users who suspect that their ISP interferes with web compression can verify it on the browser compression support test page at Browserscope.org.
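One way to see the mechanism from the client side is to request the same URL twice, once advertising gzip support and once without, and compare what comes back. The sketch below is an illustration using Python's standard library, not a tool from the article, and the URL is a placeholder; if the request that advertises gzip still comes back without Content-Encoding, something on the path (or the server itself) is not delivering compressed content.

```python
import urllib.request


def probe(url, accept_encoding):
    """Fetch url and report (Content-Encoding, body size in bytes)."""
    headers = {"User-Agent": "compression-probe/0.1"}
    if accept_encoding is not None:
        headers["Accept-Encoding"] = accept_encoding
    req = urllib.request.Request(url, headers=headers)
    with urllib.request.urlopen(req) as resp:
        body = resp.read()  # urllib does not transparently decompress
        return resp.headers.get("Content-Encoding", "identity"), len(body)


url = "https://www.example.com/"  # placeholder: substitute any page to test
with_gzip = probe(url, "gzip")
without = probe(url, None)
print("advertising gzip:    encoding=%s, %d bytes" % with_gzip)
print("no Accept-Encoding:  encoding=%s, %d bytes" % without)
```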

There is one more case in which users download uncompressed content: the website they visit does not compress its content at all. The table below lists several popular sites that serve uncompressed content. If these websites compressed their content, they could cut the average page load time by hundreds of milliseconds per visitor, with an even more noticeable effect for narrowband users.

| Website | Resource type | Compressible bytes |
|---|---|---|
| http://www.cnn.com/ | CSS and JavaScript | 330 kB |
| http://www.twitter.com/ | CSS and JavaScript | 40 kB |
| http://www.bbc.co.uk/ | CSS and JavaScript | 201 kB |

 

Data generated with Page Speed
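For readers without Page Speed at hand, a rough version of the same measurement can be scripted: download a text resource (CSS or JavaScript), gzip it locally, and report how many bytes compression would have saved. The sketch below is an illustrative approximation, not the tool used above, and the URL is a placeholder.

```python
import gzip
import urllib.request


def compressible_bytes(url):
    """Rough estimate of the bytes a server would save by gzip-compressing url."""
    req = urllib.request.Request(url, headers={"User-Agent": "gzip-estimate/0.1"})
    with urllib.request.urlopen(req) as resp:
        body = resp.read()
        already = resp.headers.get("Content-Encoding", "identity") != "identity"
    saved = len(body) - len(gzip.compress(body))
    return len(body), saved, already


# Placeholder URL: point this at any CSS or JavaScript file you care about.
size, saved, already_compressed = compressible_bytes("https://www.example.com/style.css")
note = " (already served compressed)" if already_compressed else ""
print(f"{size} bytes on the wire, ~{saved} bytes "
      f"({100 * saved / size:.0f}%) could be saved by gzip{note}")
```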

What can we do about it?

Reducing the amount of uncompressed content on the web requires a joint effort:

· Corporate IT departments and individual users can upgrade their browsers, especially users who access the web with IE6 through a proxy server. Using the latest version of Firefox, Internet Explorer, Opera, Safari, or Google Chrome increases the chance of receiving compressed data. A recent IEEE editorial lists further reasons, beyond compression, to upgrade from IE6.

· Anti-virus vendors can start handling compression correctly and stop removing or mangling the Accept-Encoding HTTP header in upcoming releases.

· ISPs whose HTTP proxies strip or mangle the Accept-Encoding header can upgrade, reconfigure, or replace those proxies with ones that let their users benefit from compression.

· Webmasters can use Page Speed (or similar tools) to check whether the content of their pages is being compressed.

 

This article is from the CSDN blog; please credit the source when reproducing it: http://blog.csdn.net/emu/archive/2010/02/21/5314850.aspx
