A brief analysis of ArrayBuffer

Source: Internet
Author: User

Key technologies: JavaScript, ArrayBuffer, typed arrays, DataView, Web Worker, performance comparison

These keywords are listed up front so you know what this article covers. If you are not interested, feel free to move on; if you already know this area well, advice and discussion are very welcome; and if the topic catches your eye for the first time, sit back and enjoy the read.

First, why do web developers need to keep optimizing data transfer? Because data is the core of an application: it directly determines whether the user experience is good or bad, and users are never satisfied. Their demands keep growing, which often leads to requests like: "The C/S client can do this, why can't the B/S version?" You used to be able to smile and shrug that off, but with the spread of the HTML5 standard it has become technically feasible. HTML5 provides canvas, WebGL, WebSocket, audio and video, and many other capabilities; it is practically a browser-based operating-system API. This is a big achievement with a huge impact; even Adobe has fully embraced HTML5, and every web developer has to keep up with the times.

Whatever HTML5 feature you use, data is the core problem, especially in the big-data era. We need to look at data in a new way; with more capable hardware and the rich HTML5 feature set, many things that used to be impossible are now possible, which in turn makes the data itself larger. For audio, video, or 3D models with tens of thousands of records to transfer, traditional JSON or XML quickly becomes awkward once the data volume grows, and the problem cannot be avoided. So how do we solve the performance problem of transferring large amounts of data? The answer is simple: learn from C/S!

1. Creating, reading, and writing

Traditional C/S applications mostly use binary file formats, often with zip compression: compact and efficient, and the operating system's I/O can handle them, so they cope even with large volumes of data. A similar requirement appeared early on in WebGL: JavaScript has to exchange large amounts of real-time data with the graphics card, and that communication must be binary, so JavaScript needed an equally efficient way to access binary data. The result was typed arrays.

An ArrayBuffer is simply a block of memory for the user to read and write, and using it is straightforward:

// Create 16 bytes of memory
var buffer = new ArrayBuffer(16);

// View it as 32-bit integers: 4 elements * 4 bytes = 16 bytes
var int32View = new Int32Array(buffer);
for (var i = 0; i < int32View.length; i++) {
    int32View[i] = i;
}

As you can see, the usage looks much like an ordinary array, but it gives you byte-level control. Of course, creating a buffer with new is not the point; the point is how to use an ArrayBuffer in an XMLHttpRequest, and how the server can return data in binary form.

// loadWithXhr is the helper used in the original application; the wrapper name is illustrative.
var loadArrayBuffer = function (url, headers) {
    return loadWithXhr({
        url: url,
        // tell the request layer to return an ArrayBuffer
        responseType: 'arraybuffer',
        headers: headers
    });
};
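For reference, here is a minimal sketch of the same idea with a bare XMLHttpRequest and no helper library; the URL is just a placeholder:

// Minimal sketch: request binary data directly with XMLHttpRequest.
// '/data/model.bin' is a placeholder URL, not from the original code.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/data/model.bin', true);
xhr.responseType = 'arraybuffer';   // ask the browser for an ArrayBuffer
xhr.onload = function () {
    if (xhr.status === 200) {
        var buffer = xhr.response;  // an ArrayBuffer
        console.log('received', buffer.byteLength, 'bytes');
    }
};
xhr.send();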

OK, as you would expect, the same information is more compact in binary. A size comparison of the same data (shown as an image in the original) suggests the text form is roughly four times the size of the binary form.
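The exact ratio depends on the data, but it is easy to check for yourself; a rough sketch (the numbers are illustrative, not from the original measurement):

// Compare the same 10,000 numbers as JSON text and as a Float32Array.
var values = [];
for (var i = 0; i < 10000; i++) {
    values.push(Math.random() * 1000);
}
var jsonBytes = JSON.stringify(values).length;          // roughly one byte per character
var binaryBytes = new Float32Array(values).byteLength;  // exactly 4 bytes per value
console.log('JSON:', jsonBytes, 'bytes; binary:', binaryBytes, 'bytes');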

2. Data parsing

That raises the next question: binary files look intimidating, don't they? It is a fair concern. "The Art of Unix Programming" has a line to the effect that if you want to create a new binary format, you should sleep on it and decide the next morning whether it is really necessary. Web developers face the same question. But if JSON can no longer meet your needs, then, like C/S developers, steel yourself for binary; nobody ever said binary belongs exclusively to C/S developers, it is just one more road to walk. And of course JS provides ways to read and write an ArrayBuffer.

There are two ways: one is DataView and the other is the typed arrays.

The DataView API

The typed array element types

These are two different styles; strictly speaking, either one alone would do the job. The difference is that DataView works through accessor functions, while typed arrays work like array variables. Each has its strengths: for a contiguous block of data such as a VBO, a typed array maps directly onto, say, the float type, whereas a structure with several attribute fields of different types is more naturally parsed field by field, in order, with a DataView. In the end it comes down to feel; the code below should make the difference clear.

// Illustrative reconstruction: read a small header with DataView, then wrap the
// bulk data in a typed array (the variable names are not from the original code).
var pos = 0;
var view = new DataView(buffer);

var scale = view.getFloat32(pos, true);       // little-endian float32
pos += Float32Array.BYTES_PER_ELEMENT;

var vertexCount = view.getUint32(pos, true);  // little-endian uint32
pos += Uint32Array.BYTES_PER_ELEMENT;

var indices = new Uint16Array(buffer, pos, vertexCount * 3);

The code above comes from a real application: a DataView wraps the buffer and its basic methods getFloat32 and getUint32 read the header variables one after another, while a large block such as the vertex data is accessed directly through a Uint16Array.

As you can see, the key to parsing binary data is having a clearly defined binary format; the parsing itself only feels complex, and the main thing to overcome is the psychological barrier.

Two points are worth emphasizing. The first is to use BYTES_PER_ELEMENT: every typed array type carries this property recording its element size, so if an element size ever differed from your hard-coded number (the probability is essentially zero) your code would still be correct; call it obsessive-compulsive, but I think it is good practice. The second is to watch the arguments of the Uint16Array constructor: pos is in bytes, while the length is counted in Uint16 elements of two bytes each. The two use different units, so work out the offsets carefully.
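A small sketch of those units (the numbers are made up for illustration): the second constructor argument is a byte offset, the third is a count of elements.

// The byte offset is in bytes, the length is in elements.
var demoBuffer = new ArrayBuffer(16);
var demoView = new Uint16Array(demoBuffer, 4, 3); // skip 4 bytes, then view 3 uint16 values (6 bytes)
console.log(demoView.length, Uint16Array.BYTES_PER_ELEMENT); // 3, 2
// The byte offset must be a multiple of BYTES_PER_ELEMENT;
// new Uint16Array(demoBuffer, 1) would throw.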

Performance comparison of read and write operations for different array types

Comparison of read and write operations under IE

Comparison of read and write operations under Chrome

Above are the performance comparison charts (create, read & write) from my notebook. Creating an ArrayBuffer-backed typed array was nearly four times faster than creating a plain Array, and reads were about twice as fast, although plain Array operations are themselves already very fast. Among the typed types the difference between byte and int is small, and IE is dramatically slower than Chrome. Clearly, results vary a lot with different hardware and different browsers; in the end it comes down to the JS engine implementation, and there is plenty of room to dig deeper here.
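The charts above are from my machine only; if you want to reproduce the comparison, a very rough benchmark sketch like the following is enough (results vary widely between browsers and hardware):

// Rough timing sketch: plain Array vs Int32Array, write then read.
var N = 1000000;

console.time('Array write/read');
var plain = new Array(N);
for (var i = 0; i < N; i++) { plain[i] = i; }
var sum1 = 0;
for (var i = 0; i < N; i++) { sum1 += plain[i]; }
console.timeEnd('Array write/read');

console.time('Int32Array write/read');
var typed = new Int32Array(N);
for (var j = 0; j < N; j++) { typed[j] = j; }
var sum2 = 0;
for (var j = 0; j < N; j++) { sum2 += typed[j]; }
console.timeEnd('Int32Array write/read');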

One more approach that is less common but still very useful: the img tag. Sometimes, for various reasons, binary information is stored as the pixels of an image and transferred through an img tag. This is convenient and even adds a degree of obfuscation, and on the client it maps onto the canvas ImageData. The client then needs to turn the img into a typed array, which is not hard: use ImageData as the intermediate step:

// Suppose imgInfo is an image sent over by the server.
var imgInfo = new Image();
// ...once it has loaded, draw the picture onto the canvas
context.drawImage(imgInfo, 0, 0, width, height);
// get the pixels from the canvas
var imgData = context.getImageData(0, 0, width, height);
// imgData.data is a Uint8ClampedArray
var typedArray = imgData.data;
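If the server really packed binary values into the pixels, the Uint8ClampedArray can then be reinterpreted through its underlying buffer. A hedged sketch, assuming the server wrote native-endian Float32 values and that nothing (such as premultiplied alpha) altered the bytes:

// Reinterpret the RGBA bytes as floats (assumes a matching packing scheme on
// the server; such schemes usually keep alpha at 255 to avoid premultiplication).
var floats = new Float32Array(typedArray.buffer, typedArray.byteOffset, typedArray.byteLength / 4);
console.log(floats[0]);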

Beyond that, binary data brings two more issues: byte order (endianness) and byte alignment. You can look up endianness elsewhere; here it is enough to know that DataView's get/set methods take a littleEndian flag and read big-endian when it is omitted, while typed arrays always use the platform's native byte order. As for alignment, the byte offset you pass to a constructor such as Uint16Array must be an integer multiple of the element size: Uint16 is two bytes, so the offset must be divisible by 2, otherwise the browser throws an error.
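A short sketch of the endianness flag, for anyone who has not met it before:

// DataView reads big-endian unless the littleEndian flag is passed.
var endianBuffer = new ArrayBuffer(2);
var endianView = new DataView(endianBuffer);
endianView.setUint8(0, 0x12);
endianView.setUint8(1, 0x34);
console.log(endianView.getUint16(0).toString(16));       // '1234' (big-endian, the default)
console.log(endianView.getUint16(0, true).toString(16)); // '3412' (little-endian)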

3. Data processing: Web Worker

The interesting part is that JavaScript supports asynchrony but is inherently single-threaded; in the past we used setTimeout tricks to simulate concurrency. For C/S developers, multithreading is the standard tool for big data: when the data volume is large, you spin up a worker thread to process it, put the results in a shared pool, and let the UI thread consume them directly, so the interface stays responsive. JavaScript used to be powerless here; even Ajax only gives you partial updates, so the page "seems responsive, but the total time is unchanged, or even longer". HTML5's Web Worker mechanism finally provides real worker threads and solves this problem nicely.

Why bring up Web Workers? Because after parsing the data there is usually further processing: building a triangle mesh from the parsed data, decompressing, decoding, and so on. Doing that on the main thread is never a perfect solution, so the natural move is to hand it to a Web Worker. The current WebGL version of Google Earth already uses workers to process data, while Baidu's 3D map does not yet; digging into the two reveals a lot of interesting technical differences. That topic deserves a detailed introduction of its own, since it is also data-related; here I only open the door.

The example below is quite simple. In practice there are many restrictions on what workers can do and real skill involved in designing around them, so I will not go further here; multithreading takes hands-on experience to get right. The example is only meant to give a basic feel for the API.

Main script:

var worker = new Worker('dowork.js');
worker.addEventListener('message', function (e) {
    console.log('Worker said: ', e.data);
}, false);
worker.postMessage('Hello worker'); // pass data to the worker thread

dowork.js (the worker):

self.addEventListener('message', function (e) {
    self.postMessage(e.data); // echo the message back to the main thread
}, false);
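One more trick worth knowing in the context of this article: an ArrayBuffer can be handed to a worker as a transferable object, so the bytes are moved instead of copied. A minimal sketch, reusing the worker created above:

// Transfer an ArrayBuffer to the worker instead of structured-cloning it.
var bigBuffer = new ArrayBuffer(1024 * 1024);
worker.postMessage(bigBuffer, [bigBuffer]); // second argument: list of transferables
console.log(bigBuffer.byteLength);          // 0 - ownership has moved to the worker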
4. Data rendering

This step is not really about ArrayBuffer, but it completes the picture of a piece of data travelling from start to finish. WebGL hardware acceleration, rendering in batches directly on the graphics card, is the only approach to rendering big data that I know of; I have not researched other high-performance rendering options, so WebGL is the one idea offered here.
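For completeness, this is where a typed array finally meets the GPU. A minimal sketch of uploading vertex data as a Float32Array, assuming gl is an existing WebGL context (not shown here):

// Upload a Float32Array view (backed by an ArrayBuffer) as a vertex buffer.
var vertices = new Float32Array([
     0.0,  0.5, 0.0,
    -0.5, -0.5, 0.0,
     0.5, -0.5, 0.0
]);
var vbo = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vbo);
gl.bufferData(gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);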

5. Other

1. Asynchrony

In JS the data generally lives on the server and arrives asynchronously, unlike C/S where it is usually loaded directly from local disk, so scheduling gets more complex. The browser limits the number of TCP connections, so the number of simultaneous requests has to be controlled; the server's network bandwidth is a bottleneck, which you can work around with cross-domain requests and multiple IPs to increase parallel downloads; you may also use zip compression and make good use of the browser cache. There are many points to consider and they matter a lot in practice, so they should be thought through at design time. Wrapping requests in a sensible Promise pattern also improves the readability of the code.
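A sketch of what such a Promise wrapper might look like (the function name is illustrative, not from the original code):

// Wrap an ArrayBuffer request in a Promise so scheduling code stays readable.
function requestArrayBuffer(url) {
    return new Promise(function (resolve, reject) {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', url, true);
        xhr.responseType = 'arraybuffer';
        xhr.onload = function () {
            if (xhr.status === 200) {
                resolve(xhr.response);
            } else {
                reject(new Error('HTTP ' + xhr.status));
            }
        };
        xhr.onerror = function () { reject(new Error('network error')); };
        xhr.send();
    });
}

// Usage: requestArrayBuffer('/data/tile.bin').then(function (buffer) { /* parse it */ });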

2. Data security

Although JS code can be obfuscated, it can still be debugged on the client, which means anyone can work out your data format by reading your JS source. Data is often the core asset, and in many cases you do not want users to know the internal structure of your binary data; unfortunately JS itself cannot keep it secret, so you have to approach the problem differently. Personally I see two options. One is server-side authorization, for example tokens. The other is to add some redundant information to the data as a unique fingerprint, which serves as copyright evidence if someone steals it. Map vendors, for example, often add a few unique, non-existent locations to their maps; if another vendor's map shows them, it proves the data was copied without verification. That is the evidence.

Summary

HTML5 has many good features and poses a real challenge for web development, but purely technically it is full of temptation, and in the big-data era these are exactly the kind of ideas and base technologies that B/S big-data solutions need. I have no deep background in big data, but with my admittedly shallow understanding I would say that a web application that makes no use of binary data and hardware acceleration can hardly be called big-data technology; at best it is tactical optimization, trading time for space or balancing server against client, rather than solving the problem in terms of quality.


This reminds me of the picture above (borrowed from a classmate's company): don't casually dismiss what you think is impossible. Technological innovation keeps filling in what used to not exist.

The following are three different responses the world-record holder for the highest free-fall skydive (yes, a Google executive, who made the jump in a spacesuit faster than the speed of sound) gave when faced with a question from his daughter:

It is impossible.

Even if it's not impossible, it's very, very hard.

Well, maybe it isn't hard; it's just that I don't know how.
