Original link: https://davidwalsh.name/combining-js-arrays
Link to original translation: http://www.ituring.com.cn/article/497290
This is a short article on JavaScript technique. We'll look at different strategies for combining/merging two arrays, and the pros and cons of each approach.
Let's start with the scenario:
var a = [ 1, 2, 3, 4, 5, 6, 7, 8, 9 ];
var b = [ "foo", "bar", "baz", "bam", "bun", "fun" ];
The obvious result of combining them is:
[ 1, 2, 3, 4, 5, 6, 7, 8, 9, "foo", "bar", "baz", "bam", "bun", "fun" ]
concat(..)
The most common practice is as follows:
var c = a.concat( b );

a; // [1,2,3,4,5,6,7,8,9]
b; // ["foo","bar","baz","bam","bun","fun"]

c; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
As the code shows, c is a brand new array representing the combination of a and b, leaving a and b untouched. Pretty simple, right?
But what if a and b each contain 10,000 elements? Then c will have 20,000 elements, essentially doubling the memory already occupied by a and b.
"It's no big deal!" "You smiled a little." We can take a
and b
Delete, so that the occupied memory can be recycled, so always OK? The crisis is lifted!
a = b = null; // `a` and `b` can go away now
Meh. For a couple of small arrays, that's fine. But for large arrays, for operations repeated frequently, or for memory-constrained environments, it leaves a lot to be desired.
Looped Insertion
OK, let's use the Array#push(..) method to append the contents of one array onto the end of another:
// `b` onto `a`:
for (var i=0; i < b.length; i++) {
    a.push( b[i] );
}

a; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]

b = null;
Now a contains the contents of the original a plus the contents of b. That seems much better for memory usage.
But what if a is small and b is comparatively large? For both memory and speed, you'd want to push the smaller a onto the front of the larger b rather than appending the large b onto the end of a. No problem: just substitute unshift(..) for push(..) and loop in the opposite direction:
// `a` into `b`:
for (var i=a.length-1; i >= 0; i--) {
    b.unshift( a[i] );
}

b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]

a = null;
Functional Tricks
Unfortunately, such for loops are ugly and hard to maintain. Can we do better?
Here's our first attempt, using Array#reduce:
// `b` onto `a`:
a = b.reduce( function(coll,item){
    coll.push( item );
    return coll;
}, a );

a; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]

// or `a` into `b`:
b = a.reduceRight( function(coll,item){
    coll.unshift( item );
    return coll;
}, b );

b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
Array#reduce(..) and Array#reduceRight(..) are nice, but a little clunky. ES6's => arrow functions can slim them down somewhat, but each element still costs a function call, which is unfortunate.
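For illustration, here's roughly how those arrow-function versions might read (a minimal sketch, assuming an ES6 environment):

// `b` onto `a`:
a = b.reduce( (coll,item) => ( coll.push( item ), coll ), a );

// or `a` into `b`:
b = a.reduceRight( (coll,item) => ( coll.unshift( item ), coll ), b );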
What about this instead:
// `b` onto `a`:
a.push.apply( a, b );

a; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]

// or `a` into `b`:
b.unshift.apply( b, a );

b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
Much nicer, right? Notably, the unshift(..) version no longer needs to worry about reversed traversal order. ES6's spread operator is nicer still: a.push( ...b ) or b.unshift( ...a ).
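Spelled out, those spread versions look like this (again assuming an ES6 environment):

// `b` onto `a`:
a.push( ...b );
a; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]

// or `a` into `b`:
b.unshift( ...a );
b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]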
But the princess and the prince don't live happily ever after just yet. In both cases, passing a or b as the second argument to apply(..) (or spreading it via ...) means the array gets spread out as the function's arguments.
The first major problem is that we're effectively doubling the size of the appended array (temporarily, of course), since its contents are essentially copied onto the function call stack. Moreover, different JS engines have different implementation-dependent limits on how many arguments can be passed to a single function call.
So if the array being appended had, say, a million elements, the call would almost certainly exceed the call-stack size limit for push(..) or unshift(..). Ugh. A few thousand elements should be fine, but be careful to stay within a reasonably safe limit.
Note: You can try the same thing with splice(..), but you'll reach the same conclusion as with push(..) / unshift(..).
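For the curious, here's one way that splice(..) variant might look; it spreads the array into the call just like the apply(..) examples above, so it hits the same call-stack limits (a sketch, not a recommendation):

// `a` into `b`, via splice(..):
// splice(0, 0, ...) inserts the remaining arguments at index 0
b.splice.apply( b, [ 0, 0 ].concat( a ) );

b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]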
One workaround is to keep this approach but process the array in safely sized batches:
function combineInto(a,b) {
    var len = a.length;
    // walk `a` backward in 5000-element chunks, so that the repeated
    // unshift(..) calls keep the chunks in their original order
    for (var i=len; i > 0; i=i-5000) {
        b.unshift.apply( b, a.slice( Math.max( 0, i-5000 ), i ) );
    }
}
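A quick usage sketch (the names and sizes here are just for illustration):

var a = [];
for (var i=0; i < 1000000; i++) {
    a.push( i );
}
var b = [ "foo", "bar" ];

combineInto( a, b );

b.length;   // 1000002
b[0];       // 0
b[1000000]; // "foo"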
But wait: now we're hurting readability again (and possibly performance!). Let's stop here before we give up every gain we've made.
Summary
Array#concat(..) is the tried-and-true way to combine two (or more) arrays. But the hidden risk is that it creates a new array rather than modifying one of the existing ones.
There are plenty of options that modify an existing array in place, but they all involve trade-offs.
Weighing the various pros and cons, perhaps the best of all the options (including others not shown here) is reduce(..) / reduceRight(..).
Whichever method you choose, think critically about your array-merging strategy rather than taking it for granted.