Analysis of how Apple implements the design effects on the iPhone 5 official website

Summary: As we all know, the iPhone 5 was released on September 12, and its performance, system, applications, and hardware have been analyzed to death. But how many people have noticed the design and implementation of the iPhone 5 website? In fact, every time Apple releases a new product, it uses the web to show off that product's special features. This article analyzes how specific effects on the iPhone 5 website are actually implemented.

We all know the iPhone 5 was released on September 12, and no new Apple product is complete without its website. Every time Apple releases a new feature, it builds a web implementation of it on the site: the iPhone 4's Retina magnifier, the rotating battery-life clock, the new iPad's bizarre scrolling effect (ugh), and so on. The iPhone 5 site likewise shows off the new features, but the principle behind it is more complex than it looks:

The iPhone 5 page automatically plays a video of the device unlocking, but something is strange: there is no <video> element on the page, only a <canvas>. Even if you check the "Network" tab of Chrome's Web Inspector, you cannot find any video; you will, however, see a strange-looking JPEG image.

Why would Apple do such a ridiculous thing? The company was involved not only in developing the H.264 standard but also in developing Safari, so there is no way it doesn't know the obvious "shortcut": the <video> element.

The reason is that on the iPhone, the <video> element can only play full screen, which defeats Apple's intention of playing the video inline. On the desktop, Apple's site needs to work in all major browsers, but Firefox and Opera do not support H.264 (and, again, Apple is unlikely to fall back to the open WebM or Theora formats).

Take a look at the Retina MacBook Pro "Features" page: 2 seconds of video there requires 5 MB of JPEG images (and a large number of separate HTTP requests). For the iPhone 5 page, Apple used a new method that avoids requesting a picture for every frame. The new animation needs only a handful of HTTP requests and about 1 MB of traffic, significantly improving performance over the old solution.

Here are the files that make up the encoded "video":

http://www.apple.com/iphone/design/images/unlock/unlock_manifest.json
http://www.apple.com/iphone/design/images/unlock/unlock_keyframe.jpg
http://www.apple.com/iphone/design/images/unlock/unlock_001.jpg
http://www.apple.com/iphone/design/images/unlock/unlock_002.jpg
http://www.apple.com/iphone/design/images/unlock/unlock_endframe.jpg

The logic behind this video lives in ac_flow.js, and I recommend turning on "Pretty Print" in the Chrome Web Inspector (click the "{}" button in the lower left corner) and reading it carefully. I won't give a detailed code walkthrough here, so do take a close look at the code yourself.

The video compression method Apple uses loads only the parts of each frame that change: "unlock_001.jpg" and "unlock_002.jpg" store the updated image blocks, and "unlock_manifest.json" describes where those blocks should be placed.
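
The manifest fragment looks roughly like the following sketch. Only "blockSize" and "imagesRequired" are field names attested below; the frame strings here are illustrative stand-ins, not Apple's actual data:

```json
{
  "blockSize": 8,
  "imagesRequired": 2,
  "frames": [
    "AaxACBfxAD...",
    "CgxABDpxAC..."
  ]
}
```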

JPEG files are encoded in 8x8 macroblocks, so Apple wisely uses the same block size ("blockSize: 8" in the JSON), which keeps lossy compression artifacts from one block from bleeding into unrelated parts of the image. (The JPEG images use 4:4:4 chroma subsampling, so the chroma macroblocks are also 8x8.)

"Imagesrequired" indicates that at the beginning of the animation you need to load two pictures--"unlock_001.jpg" and "unlock_002.jpg", which are treated as continuous 8x8 block flows from left to right and from top to bottom. (If I'm referring to the JPEG stream below, that's it.) )

These "frames" look like Base64 data, but the strings are not decoded as ordinary Base64; instead, each character is read as its Base64 digit value. In other words, each byte really carries only 6 bits.

Apple's code turns each character of these strings back into its numeric value.
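
A minimal sketch of that decoding step (my reconstruction; the real thing lives in ac_flow.js):

```javascript
// Standard Base64 alphabet: A-Z -> 0-25, a-z -> 26-51, 0-9 -> 52-61, + -> 62, / -> 63.
var BASE64 =
  "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

// Decode one Base64 character to its 6-bit value (0-63).
function b64Digit(ch) {
  return BASE64.indexOf(ch);
}

// Decode a short run of characters as one big-endian base-64 number,
// e.g. b64Number("AC") === 2 and b64Number("Aax") === 1713.
function b64Number(str) {
  var n = 0;
  for (var i = 0; i < str.length; i++) {
    n = n * 64 + b64Digit(str.charAt(i));
  }
  return n;
}
```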

Each frame is a list of 5-character instructions. The first 3 characters of an instruction encode the position on the <canvas> to update; the last 2 characters encode the number of blocks to read. For example, the first instruction of the first frame, "AaxAC", means: read 2 blocks ("AC" as Base64 digits) from the JPEG stream and draw them at the <canvas> block position encoded by "Aax".
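
Putting the pieces together, a simplified frame decoder could look like this. It reuses the helpers above; the assumption that a run of blocks fills consecutive destination positions is mine, and the real logic is in ac_flow.js:

```javascript
// Apply one frame's instruction string to the canvas.
// ctx: 2D context of the target <canvas>; stream: the loaded JPEG stream image;
// streamIndex: how many blocks of the stream have been consumed so far.
function applyFrame(ctx, stream, frame, streamIndex, opts) {
  var bs = opts.blockSize;                          // 8, per the manifest
  var destPerRow = ctx.canvas.width / bs;           // blocks per canvas row
  var srcPerRow = stream.width / bs;                // blocks per stream row
  for (var i = 0; i < frame.length; i += 5) {
    var dest = b64Number(frame.slice(i, i + 3));    // where to draw
    var count = b64Number(frame.slice(i + 3, i + 5)); // how many blocks to copy
    for (var j = 0; j < count; j++) {
      var s = blockPosition(streamIndex++, srcPerRow, bs);
      var d = blockPosition(dest + j, destPerRow, bs);
      ctx.drawImage(stream, s.x, s.y, bs, bs, d.x, d.y, bs, bs);
    }
  }
  return streamIndex; // so the next frame continues where this one stopped
}
```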

Note that blocks from the JPEG stream are never reused: every block is consumed exactly once per request. This allows some redundancy in the JPEG data, but it keeps the manifest small and the format simple.

Apart from the icons, the longest "frame" in the stream image is the animated part, which is why blank gaps appear between runs of blocks in the JPEG stream. So that's how Apple encodes video into a JPEG image. But it doesn't stop at video.

Apple also applies this compression method to QTVR-style (virtual reality) views: when you drag the headphones in the picture, you are really scrubbing through a video of the headphones rotating, built with the same JS/JSON/<canvas> compression technique.
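
The drag-to-scrub mapping is conceptually simple; here is a hypothetical version (the constant and function names are mine, not from ac_flow.js):

```javascript
// Map horizontal drag distance (in pixels) to a frame index, wrapping
// around so the headphones can spin continuously in either direction.
var PIXELS_PER_FRAME = 4; // sensitivity; an illustrative value
function frameForDrag(startFrame, dragDx, totalFrames) {
  var offset = Math.round(dragDx / PIXELS_PER_FRAME);
  return ((startFrame + offset) % totalFrames + totalFrames) % totalFrames;
}
```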

So how well does it perform? Not all that well, it turns out.

Where does the time go? The <canvas> API call ("drawImage") takes very little time; most of the time is spent decoding frames and applying the diffs.

It also turns out that seeking to a specific time in such a video is expensive, because decoding any single frame first requires decoding every frame before it. After all, the whole point of this format is that each frame encodes only the parts that changed, so rendering an arbitrary frame means replaying every diff that came before it.
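
To make that cost concrete, a seek in this format amounts to something like the following (building on the applyFrame sketch above):

```javascript
// Show frame `target` by replaying frames 0..target on top of the
// keyframe; there is no cheaper way in, since every frame is a diff.
function seek(ctx, keyframe, stream, frames, target, opts) {
  ctx.drawImage(keyframe, 0, 0); // start from the full keyframe image
  var streamIndex = 0;
  for (var n = 0; n <= target; n++) {
    streamIndex = applyFrame(ctx, stream, frames[n], streamIndex, opts);
  }
}
```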

To alleviate this problem, Apple provides both a forward and a backward encoding of the video (otherwise, rotating the headphones backward would be very slow). Unfortunately, this directly doubles the size of the files, and it still doesn't fix the frame skipping you see when users drag the headphones quickly.
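
With both encodings available, choosing between them is straightforward. A hypothetical helper (that frame i of the reversed encoding corresponds to forward frame totalFrames - 1 - i is my assumption):

```javascript
// Pick the encoding that lets a scrub replay diffs forward only.
function pickPlayback(current, target, totalFrames) {
  if (target >= current) {
    return { direction: "forward", frame: target };
  }
  return { direction: "backward", frame: totalFrames - 1 - target };
}
```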

What might Apple do next? Replace JSON with something better suited to carrying binary data.

It seems Apple is already testing a new version that replaces the Base64 strings in the JSON with a manifest encoded in a PNG image. Because the PNG format is lossless and <canvas> supports it very well, it is ideal for encoding arbitrary bytes, even when they are not really a picture. I am very curious what effect the PNG format will have on performance and on the size of the animation files.
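
One plausible way to read such bytes back out in the browser (a sketch, under the assumption that the manifest bytes are packed into the pixel channels):

```javascript
// Recover raw bytes from a lossless PNG via an offscreen canvas.
// Encoders typically keep alpha at 255 so the color channels survive
// the canvas's premultiplied-alpha storage intact.
function bytesFromPNG(img) {
  var canvas = document.createElement("canvas");
  canvas.width = img.width;
  canvas.height = img.height;
  var ctx = canvas.getContext("2d");
  ctx.drawImage(img, 0, 0);
  // .data is a Uint8ClampedArray of RGBA values, 4 bytes per pixel
  return ctx.getImageData(0, 0, img.width, img.height).data;
}
```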

Original link: iPhone 5 website teardown: How Apple compresses video using JPEG, JSON, and <canvas>
