How to use the latest technology to improve the speed and performance of Web pages

Tags: setcookie, webp, blank page, browser cache, HTTP Strict Transport Security, website performance, WebPagetest

We have recently updated our website. Yes, it has a completely new design. But as software developers, we care just as much about the technology-related bits and pieces behind it. Our goals were to take control of performance, to keep the site extensible in the future, and to make adding content a pleasure. So let me tell you why our website is faster than yours (sorry, it's true).

Design for performance

In our projects, we have daily discussions with designers and product owners about balancing aesthetics and performance. For our own website, this was easy. In short, we believe that a good user experience starts with delivering content as fast as possible, which means performance > aesthetics.

Good content, layout, images, and interactivity are essential for engaging users, but each of these elements affects page load time and the end-user experience. At every step we look at how to get a good user experience and an attractive design while minimizing the impact on performance.

Content first

We want to present the core content to visitors as quickly as possible, which means delivering the basic HTML and CSS first. Each page should serve its basic purpose: delivering information. JS, CSS, web fonts, images, and analytics are all enhancements on top of the core content.

Controllability

Summing up the standards we defined for our ideal site: to achieve the desired result, you need to be able to control every aspect of the site. We chose to build our own static site generator, including asset delivery, and manage it all ourselves.

Static Site Builder

We implemented the static site generator with Node.js. It uses Markdown files with a short JSON description per page to generate the entire site structure and all of its resources. An HTML file can also be attached to include page-specific scripts. Below is a simplified example of the JSON description and Markdown file used to publish this blog post and generate the real HTML, followed by a sketch of the build step.

JSON description:

{
  "keywords": ["performance", "critical rendering path", "static site", "..."],
  "publishDate": "2016-07-13",
  "authors": ["Declan"]
}

Markdown File:

# Why our website is faster than yours

We've recently updated our site. Yes, it has a complete...

 

## Design for performance

In our projects we have daily discussions...
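For illustration, here is a minimal sketch of what such a build step could look like, assuming Node.js and the classic callable API of the marked package; the file names, template, and placeholder syntax are made up for this sketch and are not the actual generator code:

const fs = require('fs');
const path = require('path');
const marked = require('marked');

// Hypothetical build step: read the JSON description and the Markdown body,
// render the Markdown to HTML and inject both into a page template.
function buildPage(dir) {
  const meta = JSON.parse(fs.readFileSync(path.join(dir, 'page.json'), 'utf8'));
  const body = marked(fs.readFileSync(path.join(dir, 'page.md'), 'utf8'));
  return fs.readFileSync('templates/blog.html', 'utf8')
    .replace('{{ keywords }}', meta.keywords.join(', '))
    .replace('{{ content }}', body);
}

fs.writeFileSync('dist/index.html', buildPage('content/blog/faster'));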

Image delivery

Of an average page weighing 2,406 KB, 1,535 KB is images. Because images make up such a large proportion of a page, they are one of the priorities of performance optimization.

WebP format

WebP is a modern image format that provides excellent lossless and lossy compression for web images. WebP images are substantially smaller than other formats, sometimes 25% smaller than the equivalent JPEG. WebP is overlooked by most people and not used very often. At the time this article was written, WebP was only supported by Chrome, Opera and Android (still more than 50% of our users), but we can degrade gracefully to JPG/PNG.

Using the <picture> element we can gracefully degrade from WebP to a more widely supported format such as JPEG:

<picture>
  <source type="image/webp" srcset="image-l.webp" media="(min-width: 640px)">
  <source type="image/webp" srcset="image-m.webp" media="(min-width: 320px)">
  <source type="image/webp" srcset="image-s.webp">
  <source srcset="image-l.jpg" media="(min-width: 640px)">
  <source srcset="image-m.jpg" media="(min-width: 320px)">
  <source srcset="image-s.jpg">
  <img alt="Description of the image" src="image-l.jpg">
</picture>

We use Scott Jehl's picturefill to add support in browsers that lack the <picture> element and to get consistent results across browsers.

We also include a fallback <img> element for browsers that do not support <picture> or JavaScript. Using the largest image instance ensures the page still looks good in this fallback scenario.

Generating images

Although we have settled on how to deliver images, we still need to think about how to do it efficiently. I like what the <picture> element does for us, but I don't like writing the snippet above by hand, especially while writing content. We don't want the laborious job of creating six instances of every image, optimizing them, and wiring them into a <picture> element in the Markdown file. So instead, during the build process we:

    • generate multiple instances of each original image, in JPG/PNG and WebP formats, using gulp-responsive;

    • keep the image declaration in the Markdown file minimal, like ![Image description](image.jpg);

    • use a custom Markdown renderer during the build to compile this conventional Markdown image declaration into the fully fledged <picture> element shown above (see the sketch after this list).
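Here is a minimal sketch of such a renderer, again using marked's classic API; the image naming convention (-s/-l suffixes) and the breakpoints are assumptions for illustration, not the exact markup our build produces:

const marked = require('marked');

// Hypothetical custom renderer: turn ![alt](image.jpg) into a <picture>
// element that points at the generated WebP and JPG/PNG instances.
const renderer = new marked.Renderer();
renderer.image = function (href, title, alt) {
  const name = href.replace(/\.(jpg|png)$/, '');
  const ext = href.split('.').pop();
  return [
    '<picture>',
    `<source type="image/webp" srcset="${name}-l.webp" media="(min-width: 640px)">`,
    `<source type="image/webp" srcset="${name}-s.webp">`,
    `<source srcset="${name}-l.${ext}" media="(min-width: 640px)">`,
    `<source srcset="${name}-s.${ext}">`,
    `<img src="${name}-l.${ext}" alt="${alt}">`,
    '</picture>'
  ].join('\n');
};

const html = marked('![Image description](image.jpg)', { renderer: renderer });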

SVG animations

We chose a distinctive style of iconography for our site, with SVG illustrations taking the lead. This was done for several reasons:

    • First, SVG images are smaller than bitmaps;

    • Second, SVG images are inherently responsive and scale perfectly, so there is no need to create multiple image instances or <picture> elements;

    • Finally, and just as important, we can restyle and animate them with CSS! All of our portfolio pages have a custom animated SVG illustration that is reused on the overview page. The illustration acts as a recurring visual motif across our portfolio pages, tying the design together with barely any impact on performance. Take a look at the animation and see how we change it with CSS.

Custom web fonts

Before digging in, here's a quick summary of how browsers handle custom fonts. When the browser finds a @font-face definition in the CSS for a font the user's computer does not have, it tries to download the font file. While downloading, most browsers do not display the text that uses this font at all. This phenomenon is called the "Flash of Invisible Text", or FOIT. If you look for it, you will find it on many pages. If you ask me, it hurts the user experience: it delays the moment users can read the content they came for. We can force the browser to change this behavior into a "Flash of Unstyled Text", or FOUT. We tell the browser to use a normal font first, like Arial or Georgia. When the custom font has finished downloading, it replaces the standard font and the text is rendered again. If the custom font fails to download, the content remains perfectly readable. Some would argue that this is a compromise, but we think custom fonts should only ever be an enhancement: even without them, the page looks good and works perfectly. Tick or untick the box to toggle the fonts on our page and try it yourself:

Toggle the fonts-loaded class
Custom web fonts can improve the user experience, as long as you optimize them and serve them responsibly.

Font subsetting

By far, subsetting is the quickest way to improve web font performance. I would recommend it to every web developer using custom fonts. If you have complete control over the content of the page and know exactly which characters it will show, you can subset aggressively. But even just subsetting the font to Western languages has a big impact on file size. For example, our Noto Regular WOFF font is 246 KB by default; subset to Western languages it drops to 31 KB. We used the Font Squirrel webfont generator, which makes this really easy.

Font Face Observer

Bram Stein's Font Face Observer is a great script that checks whether fonts have been loaded. It does not care how you load the fonts, whether through a web font service or hosted yourself. After the observer tells us that all custom fonts have been downloaded, we add a fonts-loaded class to the <html> element and our page re-renders with the new fonts:

html {
  font-family: Georgia, serif;
}

html.fonts-loaded {
  font-family: Noto, Georgia, serif;
}

Note: for brevity, the @font-face declaration for Noto is omitted from the CSS above.

We also set a cookie to remember that all the fonts have been loaded and are therefore sitting in the browser cache. We use this cookie on repeat visits, which I will explain later.
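As a rough sketch, and assuming Bram Stein's fontfaceobserver script is already on the page, the observer-plus-cookie approach could look like this (the setCookie helper here is a stand-in, not our actual implementation):

// Wait for the custom font, then switch the page over and remember it in a cookie.
var noto = new FontFaceObserver('Noto');

noto.load().then(function () {
  document.documentElement.className += ' fonts-loaded';
  // Hypothetical helper: persist the state for repeat visits.
  setCookie('fonts-loaded', 'true', 100);
});

function setCookie(name, value, expInDays) {
  var expires = new Date(Date.now() + expInDays * 24 * 60 * 60 * 1000);
  document.cookie = name + '=' + value + '; expires=' + expires.toUTCString() + '; path=/';
}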

In the near future we may no longer need Bram Stein's script for this behavior. The CSS Working Group has proposed a new @font-face descriptor, font-display, whose value controls how a downloadable font renders before it has loaded. font-display: swap would give us the same behavior as the method described above. You can read more about the font-display property.
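For illustration, the descriptor lives inside the @font-face rule itself; a minimal sketch (the file names are placeholders):

@font-face {
  font-family: Noto;
  src: url('noto-regular.woff2') format('woff2'),
       url('noto-regular.woff') format('woff');
  /* Render fallback text immediately and swap in Noto once it arrives (FOUT-style). */
  font-display: swap;
}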

JS and CSS Lazy loading

In general, we load the required resources as early as possible. We eliminate render-blocking requests to speed up the first paint, and lean on the browser cache for repeat visits.

JS Lazy Loading

By design, our site does not use a lot of JS. We developed a JavaScript workflow for the JS we have now and any JS we may add in the future.

JS blocks page rendering, which is not what we want. JS should only enhance the user experience; it should not be critical for visitors. The simple way to deal with render-blocking JS is to put the scripts at the end of the page, so they are only loaded and executed after the full HTML has been parsed and rendered.

Another option is to keep the script in the <head> but delay its execution by adding the defer attribute to the <script> tag. The browser then downloads the script early without blocking rendering, and only executes the code once the document has finished parsing. One more thing: since we don't use libraries like jQuery, our scripts rely on vanilla JavaScript features, and we only want to load them in browsers that support those features. The end result is code like this:

<script>
if ('querySelector' in document && 'addEventListener' in window) {
  document.write('<script src="index.js" defer><\/script>');
}
</script>

We put this little script in the <head> of the page to detect whether the browser supports document.querySelector and window.addEventListener. If it does, we load our script by writing a <script> tag into the page, using the defer attribute so it does not block rendering.

Lazy Loading CSS

For the first view, the site's largest render-blocking resource is the CSS. The browser only starts rendering the page once the CSS referenced in the <head> has fully loaded. This behavior is deliberate; otherwise the browser would have to recalculate layout and repaint continuously while styles trickle in.

To prevent the CSS from blocking rendering, we load the CSS file asynchronously using the Filament Group's awesome loadCSS function. It provides a callback in which we set a cookie declaring that the CSS has been loaded. We use this cookie on repeat visits, which I will explain later.
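A minimal sketch of that pattern, using loadCSS together with its onloadCSS helper (the stylesheet name and the setCookie function are placeholders; a fuller version appears in the cookie example later on):

// Load the full stylesheet without blocking rendering.
var stylesheet = loadCSS('main.css');

// Once the stylesheet has loaded, remember that for repeat visits.
onloadCSS(stylesheet, function () {
  setCookie('css-loaded', 'true', 100);
});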

Loading CSS asynchronously introduces a new problem: although the HTML renders very quickly, it appears as plain, unstyled HTML until the full CSS file has finished loading. This is where critical CSS comes in.

Critical CSS

Critical CSS is the small subset of blocking CSS needed to render what the user sees first. We focus on the part of the page "above the fold". Obviously, where the fold sits differs wildly between devices, so we make an educated guess.

Extracting this critical CSS by hand is a time-consuming process, especially when styles, features, and so on keep changing. There are several good scripts that can generate the critical CSS as part of your build; we adopted Addy Osmani's version.
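As a rough sketch, generating critical CSS with the critical package during a build could look like this (the paths and viewport dimensions are assumptions for illustration):

const critical = require('critical');

// Extract the above-the-fold CSS for a page and inline it into the generated HTML.
critical.generate({
  base: 'dist/',
  src: 'index.html',
  dest: 'index.html',
  inline: true,
  width: 1300,
  height: 900
});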

Below is our page rendered with only the critical CSS versus the full CSS. Note that below the fold some content is still unstyled.

The left-hand page is rendered with the critical CSS only, the right-hand page with the full CSS. The red line marks the fold.

Server side

We host the De Voorhoede website ourselves because we want full control over the server environment, and because we want to experiment with how much performance we can gain through server configuration. Currently we use Apache and serve the site over HTTPS.

Configuration

To improve performance and security, we spent time on how to configure the server.

We use the H5BP boilerplate Apache configuration, which is a great starting point for improving the performance and security of an Apache web server. They also have configurations for other server environments. We gzip most of our HTML, CSS, and JS, and we set caching HTTP headers for all of our site resources. If you are interested, read the section on file-level caching below.
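To give a sense of what that involves, here is a small, illustrative excerpt of the kind of directives such a configuration contains (not a copy of the H5BP config), using mod_deflate for gzip and mod_expires for cache headers:

<IfModule mod_deflate.c>
  # Compress text-based responses before sending them over the wire.
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

<IfModule mod_expires.c>
  # Cache static assets for a long time; keep HTML fresh.
  ExpiresActive on
  ExpiresByType text/html "access plus 0 seconds"
  ExpiresByType text/css "access plus 1 year"
  ExpiresByType application/javascript "access plus 1 year"
</IfModule>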

HTTPS

Serving your site over HTTPS can have a performance cost, mainly because the TLS handshake introduces extra latency. But as usual, there are things we can do about it!

HTTP Strict Transport Security (HSTS) is a response header through which the server tells the browser to interact with it over HTTPS only. Instead of relying on an HTTP-to-HTTPS redirect, all future requests to the site are upgraded to HTTPS by the browser itself. This saves a round trip.
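In Apache the header can be set with mod_headers; a minimal example (the max-age value is illustrative):

<IfModule mod_headers.c>
  # Tell browsers to use HTTPS only for this site for the next year.
  Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
</IfModule>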

TLS False Start allows the client to send encrypted application data immediately after the first TLS round trip. This optimization reduces the number of round trips for a new TLS connection: once the client knows the encryption key, it can start transmitting application data, while the remainder of the handshake, which confirms that nobody tampered with the handshake records, is processed in parallel.

TLS session resumption saves us a round trip when the browser and server have talked before. The browser remembers a session identifier, and when the next connection is initiated, the identifier is reused, saving a round trip.

I may sound like a security or ops specialist here, but I'm not; I've just read some articles and watched some videos. I really liked Emily Stark's Google I/O 2016 talk, Mythbusting HTTPS: Squashing security's urban legends.

Use of cookies

We do not use a server-side language, just static files served by Apache. But an Apache web server can still do a bit of "backend" work, such as Server Side Includes (SSI) and reading cookies. By using cookies cleverly and letting Apache serve different fragments of HTML, we can dramatically improve front-end performance. Here's an example (our actual code is a bit more complex, but the idea is the same):

<!-- #if expr="($HTTP_COOKIE!=/css-loaded/) || ($HTTP_COOKIE=/.*css-loaded=([^;]+);?.*/ && ${1} != '0d82f.css')" -->

<noscript><link rel="stylesheet" href="0d82f.css"></noscript>
<script>
(function() {
    function loadCSS(url) {...}
    function onloadCSS(stylesheet, callback) {...}
    function setCookie(name, value, expInDays) {...}

    var stylesheet = loadCSS('0d82f.css');
    onloadCSS(stylesheet, function() {
        setCookie('css-loaded', '0d82f', 100);
    });
}());
</script>

<style>/* Critical CSS here */</style>

<!-- #else -->
<link rel="stylesheet" href="0d82f.css">
<!-- #endif -->

Apache's server-side logic looks like HTML comments starting with <!-- #. Let's go through it step by step:

$HTTP_COOKIE!=/css-loaded/ checks whether the css-loaded cookie exists.
$HTTP_COOKIE=/.*css-loaded=([^;]+);?.*/ && ${1} != '0d82f.css' checks whether the cached version is not the current version.

If the <!-- #if expr="..." --> expression evaluates to true, we assume this is the user's first visit.

For a first visit we add a <noscript> tag containing a conventional <link rel="stylesheet">. We do this because we load the full CSS asynchronously with JavaScript; if JavaScript is unavailable, that cannot work, so we fall back to loading the CSS the usual way.

We add an inline script in which we lazy load the CSS, with an onloadCSS callback that can set a cookie.

In the same script, we load the full CSS asynchronously.

In the onloadCSS callback, we set the cookie's value to the version number of the stylesheet.

After this script, we add the critical CSS inline in a <style> element. This blocks rendering, but only very briefly, and it prevents the page from showing bare, unstyled HTML.

The <!-- #else --> branch handles repeat visits, when the css-loaded cookie already exists. Since we can reasonably assume the CSS file has been loaded before, we can rely on the browser cache and serve the stylesheet with a plain, blocking <link>; loading from cache is fast. We use the same approach to load the fonts asynchronously on the first visit and from the cache on repeat visits.

This is how we use cookies to differentiate between first and repeat visits.

File-level Caching

Since we rely heavily on the browser cache for repeat visits, we need to make sure we cache properly. Ideally we want to store resources (CSS, JS, fonts, images) in the browser forever and only update them when a file actually changes. Caches are invalidated when the requested URL changes. We tag every release with a git tag, so the simplest approach is to add the code version as a parameter to the requested URL, for example https://www.voorhoede.nl/assets/css/main-8af99277a6.css?v=1.0.4.

However, the disadvantage of this approach is that whenever we publish a new blog post (which is part of our code base, not stored in a CMS), every previously cached resource is invalidated, even though those resources have not changed.

Looking to improve on this, we found gulp-rev and gulp-rev-replace. These plugins append a content hash to each file name, so a requested URL only changes when the file itself changes, and the cache for each file is invalidated automatically. I'm really excited about this approach!
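A minimal sketch of how these plugins are typically wired into a gulp build (the task names and paths are assumptions, and this uses the gulp 3 style of task dependencies):

const gulp = require('gulp');
const rev = require('gulp-rev');
const revReplace = require('gulp-rev-replace');

// Append a content hash to asset file names and write a manifest of the mapping.
gulp.task('revision', function () {
  return gulp.src('dist/assets/**/*.{css,js,woff,jpg,png,webp,svg}')
    .pipe(rev())
    .pipe(gulp.dest('dist/assets'))
    .pipe(rev.manifest())
    .pipe(gulp.dest('dist'));
});

// Rewrite references in the HTML to point at the hashed file names.
gulp.task('rev-replace', ['revision'], function () {
  const manifest = gulp.src('dist/rev-manifest.json');
  return gulp.src('dist/**/*.html')
    .pipe(revReplace({ manifest: manifest }))
    .pipe(gulp.dest('dist'));
});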

Results

If you've read this far, you probably want to know the results. You can test the performance of a web page with a tool like PageSpeed Insights, which gives very useful hints, or with WebPagetest, which offers extensive network analysis. I think the best way to test rendering performance is to watch the page load while the network is being throttled to an absurdly slow speed. In Google Chrome you can do this (via the inspector > Network tab) and watch how requests slowly trickle in while the page builds up.

Below is how our page loads on a throttled 50 KB/s connection.

Network analysis of the first view of the De Voorhoede site, showing how the page builds up.

Notice how the first view renders in just 2.27 seconds on this 50 KB/s connection, indicated by the yellow line in the filmstrip and the waterfall chart. The yellow line marks the point where the HTML has been loaded. The HTML contains the critical CSS, which makes the page immediately usable. All other CSS is lazy loaded, so we can already interact with the page while the remaining resources are still coming in. This is exactly the effect we want!

Another notable thing is that the custom fonts never even load on such a slow connection; the font face observer takes care of this automatically. However, if we did not load the fonts asynchronously, most browsers would show the FOIT (Flash of Invisible Text) mentioned above.

The full CSS only finishes loading after 8 seconds. Conversely, if we had not inlined the critical CSS and had waited for the full CSS instead, visitors would have stared at a blank page for those first 8 seconds.

If you're curious, compare these load times with those of sites that don't pay much attention to performance; they go through the roof!

We tested our website with the tools described above, and the results are satisfying: PageSpeed Insights gives us a score of 100 for mobile performance. How awesome is that!

PageSpeed Insights results for voorhoede.nl: a speed score of 100!

When we run voorhoede.nl through WebPagetest, we get the following results:


WebPagetest results for voorhoede.nl.

As you can see, our server performs well and the Speed Index for the first view is 693. This means the page is usable over a cable connection within 693 milliseconds.

Roadmap

We are not done yet and will keep iterating on our approach. In the near future we will focus mainly on the following topics:

HTTP/2

We are currently experimenting with HTTP/2. Most of what is described in this article reflects best practices under HTTP/1.1. In short, HTTP/1.1 dates back to 1999, when table layouts and inline styles were all the rage. It was never designed for today's 2.6 MB web pages making 200 requests. To ease the pain of the old protocol, we concatenate JS and CSS, inline the critical CSS, and use data URIs for small images, saving requests in all sorts of ways. Since HTTP/2 can multiplex many requests in parallel over a single TCP connection, all of this concatenation and request reduction may even become an anti-pattern. We will move to HTTP/2 once we finish these experiments.

Service Workers

A service worker is a JavaScript API that modern browsers can run in the background. It enables many features that were previously unavailable to websites, such as offline support, push notifications, and background sync. We are playing around with it now, but first we have to get a service worker running on our own website. I promise you, we will!
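For illustration, registering a service worker only takes a few lines (the /sw.js path is a placeholder):

// Register a service worker if the browser supports it.
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js').then(function (registration) {
    console.log('Service worker registered with scope:', registration.scope);
  });
}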

CDN

As mentioned, we want to control and host our website ourselves. But we are going to add a CDN to eliminate the network latency caused by the physical distance between server and client. Although most of our visitors are Dutch, we want to show the world's front-end community what we can do in terms of quality, performance, and pushing the web forward.
