```js
// Test case 1: "New array with spread" — returns a brand-new array on every match
const testData = [...Array(100000).keys()];
const evens = testData.reduce((acc, v) => {
  if (v % 2 === 0) {
    return [...acc, v];
  }
  return acc;
}, []);
console.log(evens.length);
```
```js
// Test case 2: "Accumulator push" — mutates the accumulator in place
const testData = [...Array(100000).keys()];
const evens = testData.reduce((acc, v) => {
  if (v % 2 === 0) {
    acc.push(v);
  }
  return acc;
}, []);
console.log(evens.length);
```
For precise memory measurements, run Chrome with the `--enable-precise-memory-info` flag.
| Test name | Executions per second |
| --- | --- |
| New array with spread | 0.3 Ops/sec |
| Accumulator push | 195.0 Ops/sec |
Let's dive into the world of JavaScript microbenchmarks on MeasureThat.net.
## What is being tested?
The provided JSON represents two benchmark test cases that compare the performance of creating an array using two different methods:

- **New array with spread**: the spread operator (`...`) to create a new array from an existing one.
- **Accumulator push**: the `reduce()` function with accumulator updates to build up the resulting array.

## Options compared
The benchmark compares these two methods, both of which are used for creating arrays in JavaScript. Here's a brief overview of each approach:

- **New array with spread**: on every matching iteration, the entire accumulator is copied into a new array via the spread operator (`...`). The new array has its own memory allocation.
- **Accumulator push**: the accumulator array is mutated in place with `push()`, so no per-iteration copy is made.

## Pros and cons of each approach
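To see that cost difference concretely, here is a minimal Node.js sketch (the function names and the smaller array size are my own, not from the benchmark) timing both strategies side by side:

```javascript
// Sketch: spread-per-iteration vs. in-place push (illustrative sizes).
const N = 20000; // smaller than the benchmark's 100000 so the spread version finishes quickly
const data = [...Array(N).keys()];

function evensWithSpread(arr) {
  // Copies the whole accumulator on every match: O(n^2) work overall.
  return arr.reduce((acc, v) => (v % 2 === 0 ? [...acc, v] : acc), []);
}

function evensWithPush(arr) {
  // Appends to the same accumulator: amortized O(1) per element, O(n) overall.
  return arr.reduce((acc, v) => {
    if (v % 2 === 0) acc.push(v);
    return acc;
  }, []);
}

console.time('spread');
const a = evensWithSpread(data);
console.timeEnd('spread');

console.time('push');
const b = evensWithPush(data);
console.timeEnd('push');

console.log(a.length, b.length); // both 10000
```

Both functions return identical arrays; only the amount of copying differs, which is what the benchmark's roughly 650× gap in Ops/sec reflects.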
## Library and syntax considerations

There is no explicit library mentioned in the benchmark JSON; however, `Array.prototype.reduce()` is a built-in JavaScript method used in both test cases.
Beyond the spread operator (`...`, an ES2015 feature) and `reduce()`, no special JavaScript features or syntax are involved; any remaining differences between the test cases come down to how the accumulator is built, not to any particular language feature.
## Other alternatives
If we consider alternative approaches for creating arrays, we might also look at:
- `Array.prototype.slice()`, copying elements from a source array.
- `Array.from()`, creating an array from an iterable.

However, the benchmark is specifically focused on comparing the performance of the two `reduce()`-based methods, both of which are part of the standard JavaScript library.
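As a brief illustration of those alternatives (variable names here are mine, not from the benchmark), both can build or copy an array without any per-iteration spreading:

```javascript
// Array.from() creates an array from any iterable, such as an array iterator.
const fromIterable = Array.from(Array(100000).keys()); // [0, 1, ..., 99999]

// Array.prototype.slice() copies elements from a source array;
// with no arguments it produces a shallow copy with its own memory allocation.
const copy = fromIterable.slice();

console.log(fromIterable.length, copy.length); // 100000 100000
```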