const testData = [...Array(100000).keys()];
const evens = testData.reduce((acc, v) => {
    if (v % 2 === 0) {
        acc = [...acc, v];
    }
    return acc;
}, []);
console.log(evens.length);
const testData = [...Array(100000).keys()];
const evens = testData.reduce((acc, v) => {
    if (v % 2 === 0) {
        acc.push(v);
    }
    return acc;
}, []);
console.log(evens.length);
Memory measurements require Chrome to be launched with the `--enable-precise-memory-info` flag.
| Test name | Executions per second |
|---|---|
| New array with spread | 0.2 Ops/sec |
| Accumulator push | 156.7 Ops/sec |
This benchmark tests the performance of two different methods for filtering even numbers from an array in JavaScript:

Method 1: New Array with Spread (`...`) Operator:
const testData = [...Array(100000).keys()]; // Creates an array of 100,000 numbers from 0 to 99,999
const evens = testData.reduce((acc, v) => {
if (v % 2 === 0) {
acc = [...acc, v]; // Create a new array with the current value if it's even
}
return acc;
}, []); // Start with an empty accumulator array
console.log(evens.length);
Method 2: Accumulator `push()` Method:
const testData = [...Array(100000).keys()];
const evens = testData.reduce((acc, v) => {
if (v % 2 === 0) {
acc.push(v); // Add the current value to the accumulator array if it's even
}
return acc;
}, []);
console.log(evens.length);
Pros and Cons:

Method 1 (Spread Operator): copies the entire accumulator on every even element, so the reduce is quadratic (O(n²)) overall, which explains the ~0.2 ops/sec result. Its one advantage is immutability: the accumulator is never modified in place.

Method 2 (Accumulator `push()`): mutates the accumulator in place, so each step is amortized O(1) and the whole reduce is linear (O(n)). It is dramatically faster here; the only trade-off is that it relies on mutation, which some functional styles avoid.
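The asymptotic difference above can be sketched side by side. This is an illustrative comparison (function names are mine, and it uses a smaller 10,000-element input than the benchmark so the spread version finishes quickly); both produce identical results, but the spread version copies the accumulator on every even element:

```javascript
// Spread-based accumulator: each even element copies the whole array so far,
// making the total work quadratic in the input size.
function evensWithSpread(data) {
  return data.reduce((acc, v) => {
    if (v % 2 === 0) acc = [...acc, v]; // O(acc.length) copy per even element
    return acc;
  }, []);
}

// Push-based accumulator: each even element is appended in place,
// making the total work linear in the input size.
function evensWithPush(data) {
  return data.reduce((acc, v) => {
    if (v % 2 === 0) acc.push(v); // amortized O(1) per even element
    return acc;
  }, []);
}

const data = [...Array(10000).keys()];
console.log(evensWithSpread(data).length); // 5000
console.log(evensWithPush(data).length);   // 5000
```

Wrapping each call in `console.time()`/`console.timeEnd()` makes the gap visible even at this smaller size, and it widens rapidly as the input grows.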
Other Considerations:

Alternatives: you could use `filter()` directly instead of `reduce()`, which might be more performant:

const evens = testData.filter(v => v % 2 === 0);
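As a quick sanity check, the `filter()` version produces the same result as the push-based reduce while also staying linear. A minimal sketch (variable names are mine):

```javascript
const testData = [...Array(100000).keys()];

// filter() allocates a single result array and runs in O(n)
const evensFilter = testData.filter(v => v % 2 === 0);

// push-based reduce, for comparison
const evensReduce = testData.reduce((acc, v) => {
  if (v % 2 === 0) acc.push(v);
  return acc;
}, []);

console.log(evensFilter.length);                        // 50000
console.log(evensFilter.length === evensReduce.length); // true
```

`filter()` also reads more clearly for this task, since the predicate states the intent directly instead of spelling out accumulator management.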