Google has released a Webpage Statistics Report based on the 4.2 billion webpages it has indexed (websites with high PageRank may carry a higher weight):
* Some websites prevent Google's crawlers from requesting their CSS and JS script files.
* Only two-thirds of compressible content is actually served compressed. Notably, some websites serve compressed content to real browsers but deliver it to Google's crawlers uncompressed (the probe sketch after this list shows one way to check this).
* 80% of pages load 10 or more resources from the same host.
* Most popular websites do not combine the scripts and CSS stored on the same host, resulting in 8 extra HTTP requests.
* On average, each page contains 29.39 images totaling 205.99 KB.
* Each webpage contains 7.09 external scripts and 3.22 external CSS files on average; the scripts on a page total 57.98 KB, and the CSS 18.72 KB.
* Only 17 million webpages use SSL encryption, accounting for 0.4% of the total.
* The average webpage takes 4.9 seconds to load and requires requests to 49 different resources.
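
Regarding the compression caveat above: the following is a hypothetical probe, not anything from Google's report; the URL and both User-Agent strings are placeholders. It requests the same page once with a browser-like User-Agent and once with a Googlebot-like one, then compares the Content-Encoding the server returns for each:

```typescript
import * as https from "https";

// Hypothetical probe: fetch the same URL with two different User-Agent
// strings and report which Content-Encoding the server picks for each.
// https://example.com/ is a placeholder; substitute the site to test.
function contentEncodingFor(userAgent: string): Promise<string> {
  return new Promise((resolve, reject) => {
    const req = https.get(
      "https://example.com/",
      { headers: { "User-Agent": userAgent, "Accept-Encoding": "gzip" } },
      (res) => {
        res.resume(); // drain the body; only the response headers matter here
        resolve(res.headers["content-encoding"] ?? "identity");
      }
    );
    req.on("error", reject);
  });
}

async function main() {
  const browserUA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)";
  const crawlerUA =
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";
  console.log("browser:", await contentEncodingFor(browserUA));
  console.log("crawler:", await contentEncodingFor(crawlerUA));
}

main().catch(console.error);
```

If the two runs print different encodings even though both advertise gzip support, the server is negotiating compression by User-Agent rather than by Accept-Encoding alone.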
With this report, Google hopes to raise awareness of the importance of optimizing page load speed. It offers web developers extensive guides on improving efficiency, along with four main suggestions:
- Use gzip to compress pages (illustrated in the server sketch after this list)
- Use HTTP caching (also shown in that sketch)
- Optimize JavaScript code
- Merge scripts and CSS
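
To make the first two suggestions concrete, here is a minimal sketch of a plain Node.js/TypeScript server; it is not code from Google's guides, and the page body, port, and one-day cache lifetime are illustrative assumptions. It attaches a Cache-Control header and gzips the response whenever the client advertises gzip support:

```typescript
import * as http from "http";
import * as zlib from "zlib";

const body = "<html><body><h1>Hello</h1></body></html>"; // placeholder page

const server = http.createServer((req, res) => {
  // Suggestion 2: let browsers and proxies cache the response for one day.
  res.setHeader("Cache-Control", "public, max-age=86400");
  res.setHeader("Content-Type", "text/html");

  // Suggestion 1: compress the body when the client accepts gzip.
  const acceptsGzip = (req.headers["accept-encoding"] ?? "").includes("gzip");
  if (acceptsGzip) {
    res.setHeader("Content-Encoding", "gzip");
    res.end(zlib.gzipSync(body));
  } else {
    res.end(body);
  }
});

server.listen(8080); // illustrative port
```

A client that sends Accept-Encoding: gzip receives the compressed body, and repeat visits within the cache lifetime can be served straight from the browser cache with no request at all.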