This is a quick post with some tips for working with JavaScript arrays. We'll look at several different methods to combine/merge two JS arrays, and discuss the pros/cons of each approach.
Let's first consider the following situation:
var a = [ 1, 2, 3, 4, 5, 6, 7, 8, 9 ];
var b = [ "foo", "bar", "baz", "bam", "bun", "fun" ];
Obviously, the simplest combined result would be:

[ 1, 2, 3, 4, 5, 6, 7, 8, 9, "foo", "bar", "baz", "bam", "bun", "fun" ]
concat(..)
This is the most common practice:
var c = a.concat( b );

a; // [1,2,3,4,5,6,7,8,9]
b; // ["foo","bar","baz","bam","bun","fun"]
c; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
As you can see, `c` is a brand new array representing the combination of `a` and `b`, leaving `a` and `b` untouched. Simple, right?

But what if `a` has 10,000 elements, and `b` has another 10,000? `c` will then have 20,000 elements, roughly doubling the memory usage.

"No problem!" you say. "Just set `a` and `b` to null so they get garbage collected. Problem solved!"

a = b = null; // `a` and `b` can now be garbage collected

Ha! For small arrays with only a handful of elements, this is fine. But for large arrays, or for memory-limited systems that need to repeat this process often, there's plenty of room for improvement.
Looped insertion
OK, let's just copy the contents of one array onto the other, using Array#push(..):
// `b` onto `a`:
for (var i=0; i < b.length; i++) {
    a.push( b[i] );
}

a; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]

b = null;
Now array `a` has the contents of both original arrays. Memory usage also looks better, since we never created a third copy.
But what if `a` is the small array and `b` is the big one? For both memory and speed, you'd want to insert the smaller `a` at the front of the larger `b` instead. No problem: just replace push(..) with unshift(..) and loop in reverse order:
// `a` into `b`:
for (var i=a.length-1; i >= 0; i--) {
    b.unshift( a[i] );
}

b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
Functional tricks
However, the for loop is ugly and difficult to maintain. Can we do better?
Here's our first attempt, using Array#reduce(..):
// `b` onto `a`:
a = b.reduce( function(coll,item){
    coll.push( item );
    return coll;
}, a );

a; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]

// or `a` into `b`:
b = a.reduceRight( function(coll,item){
    coll.unshift( item );
    return coll;
}, b );

b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
Array#reduce(..) and Array#reduceRight(..) are nice, but a little clunky. ES6 arrow functions would slim the code down somewhat, but either way we still need a function call per element, which isn't ideal.
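For illustration, here's a sketch of the same reduce(..) trick written with an ES6 arrow function; the semantics are identical, only the syntax gets shorter:

// `b` onto `a`, ES6 style:
a = b.reduce( (coll, item) => {
    coll.push( item );  // still one function call per element
    return coll;
}, a );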
How about this:
// `b` onto `a`:
a.push.apply( a, b );

a; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]

// or `a` into `b`:
b.unshift.apply( b, a );

b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
Much better, right? Especially since the unshift(..) version doesn't have to worry about the reverse ordering from the earlier attempts. ES6's spread operator is even nicer: a.push( ...b ) or b.unshift( ...a ).
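Spelled out, the spread versions look like this (a sketch; it requires an ES6-capable engine):

// `b` onto `a` (ES6 spread):
a.push( ...b );

// or `a` into `b`:
b.unshift( ...a );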
Maximum array length
The first major problem with that approach is that memory usage doubles (temporarily, of course!), because the appended contents are essentially copied onto the call stack as function arguments. Moreover, different JS engines have implementation-dependent limits on how many arguments a single call can take.
So if an array has a million elements, push(..) or unshift(..) called this way will almost certainly blow past the call-stack size limit. It will work fine for a few thousand elements, but you have to be careful not to exceed a reasonably safe size.
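To see the failure mode, here's a sketch; the exact threshold and error message are engine-dependent, and the array size here is arbitrary:

var big = new Array( 1000000 );  // a million slots
var target = [];

try {
    // passes a million arguments in a single call; many engines will refuse
    target.push.apply( target, big );
}
catch (err) {
    console.log( err );  // e.g. "RangeError: Maximum call stack size exceeded"
}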
Note: You can also try splice(..); it has the same limitation as push(..) and unshift(..).
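For completeness, a splice(..)-based variant might look like this sketch; it hits the same argument-count limit, since the whole array is still spread into a single call:

// `a` into `b`, via splice(..):
// the arguments are (startIndex, deleteCount, ...itemsToInsert)
b.splice.apply( b, [ 0, 0 ].concat( a ) );

b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]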
One way to work around this maximum length limitation:
function combineInto(a,b) {
    // unshift `a` into `b` in chunks of 5000 elements, working from the
    // end of `a` backwards so the chunks land in their original order
    for (var i=a.length; i > 0; i=i-5000) {
        b.unshift.apply( b, a.slice( Math.max( 0, i-5000 ), i ) );
    }
}
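A quick sanity check of the chunked approach (a hypothetical usage example; the sizes are arbitrary):

var a = [], b = [ "end" ];
for (var i=0; i < 100000; i++) {
    a[i] = i;
}

combineInto( a, b );

b.length;   // 100001
b[0];       // 0
b[100000];  // "end"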