Script preparation code:

```js
const combined = [
  {oclcNumber: 1},
  {oclcNumber: 2},
  {oclcNumber: 3},
  {oclcNumber: 4},
  {oclcNumber: 5},
  {oclcNumber: 6},
  {oclcNumber: 7},
  {oclcNumber: 8},
  {oclcNumber: 9},
  {oclcNumber: 10},
  {oclcNumber: 2},
]
```
Test case: using reduce function

```js
// Deduplicate by appending a record to the accumulator only when
// its oclcNumber has not been seen yet.
combined.reduce((acc, current) => {
  return acc.find(record => record.oclcNumber === current.oclcNumber)
    ? acc
    : acc.concat(current)
}, [])
```
Test case: using filter function

```js
// Keep each record only at the index of its first occurrence.
combined.filter((current, i, arr) => {
  return arr.findIndex(record => record.oclcNumber === current.oclcNumber) === i
})
```
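As a quick sanity check (separate from the benchmark itself, and using a smaller sample array so the output is easy to verify), the two snippets can be run side by side to confirm they produce the same deduplicated result:

```js
// Small sample: the record with oclcNumber 2 appears twice.
const sample = [
  {oclcNumber: 1},
  {oclcNumber: 2},
  {oclcNumber: 3},
  {oclcNumber: 2},
];

// reduce-based dedup: append a record only if its oclcNumber is unseen.
const viaReduce = sample.reduce((acc, current) =>
  acc.find(record => record.oclcNumber === current.oclcNumber)
    ? acc
    : acc.concat(current), []);

// filter-based dedup: keep a record only at its first occurrence.
const viaFilter = sample.filter((current, i, arr) =>
  arr.findIndex(record => record.oclcNumber === current.oclcNumber) === i);

console.log(viaReduce.map(r => r.oclcNumber)); // [1, 2, 3]
console.log(viaFilter.map(r => r.oclcNumber)); // [1, 2, 3]
```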
Test name | Executions per second
---|---
using reduce function | 324955.9 Ops/sec
using filter function | 6743032.5 Ops/sec
Overview of the Benchmark
The provided benchmark compares the performance of two JavaScript functions, `reduce` and `filter`. The goal is to eliminate duplicates from an array of objects.
Libraries Used
This benchmark uses no external libraries beyond what ships with JavaScript. That said, some browsers include additional built-ins or polyfills by default, which could potentially affect the results.
Special JS Features/Syntax
Neither snippet uses special JavaScript features or syntax that would affect its execution. The focus is solely on comparing the performance of the two built-in array methods, `reduce` and `filter`.
Options Compared
The benchmark compares two approaches to removing duplicates:
- `reduce`: builds the result one record at a time, using the accumulator (`acc`) to keep track of seen values.
- `filter`: keeps a record only when its index matches the index of its first occurrence, found with `findIndex`.

Pros and Cons
Here's a brief summary:
- `reduce` gives explicit control over how the result is built, but calling `acc.find` for every record makes the pass quadratic, and `acc.concat` copies the accumulator for each unique record, adding allocation overhead.
- `filter` is more concise and avoids rebuilding the accumulator, but `findIndex` still rescans the array for every element, so it is also quadratic in the worst case.

Other Considerations
Both versions scan the array once per element, so neither scales well to very large inputs; for those, a single-pass approach keyed on `oclcNumber` (see Alternative Approaches below) is preferable.
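To make the per-record lookup cost concrete, here is a hypothetical variant (not part of the benchmark) that keeps `reduce` but tracks seen `oclcNumber`s in a `Set`, turning the linear `acc.find` scan into an O(1) check:

```js
const sample = [
  {oclcNumber: 1},
  {oclcNumber: 2},
  {oclcNumber: 3},
  {oclcNumber: 2},
];

// The Set makes each "have we seen this number?" check O(1),
// and push mutates the accumulator in place instead of copying
// it with concat on every unique record.
const seen = new Set();
const deduped = sample.reduce((acc, current) => {
  if (!seen.has(current.oclcNumber)) {
    seen.add(current.oclcNumber);
    acc.push(current);
  }
  return acc;
}, []);

console.log(deduped.map(r => r.oclcNumber)); // [1, 2, 3]
```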
Benchmark Results
The results show that the `filter` version is faster for this specific test case on a Mac OS X 10.15.6 system with Chrome 86: "using filter function" runs at about 6.7 million operations per second versus about 325 thousand for "using reduce function", roughly a 20x difference.
Overall, while both functions have their trade-offs, the choice between them should be based on the specific requirements of your use case. Here, the `filter` version is both faster and arguably easier to read. Note that the `reduce` version as written is not a memory saver either: `acc.concat` allocates a new array for every unique record, so for very large inputs a single-pass approach is the better option.
Alternative Approaches
Other alternatives for eliminating duplicates from an array of objects include:
- `Map` and `values()`: instead of `reduce`, use a `Map` keyed on `oclcNumber` to keep track of unique records, then read the map's `values()`.

Keep in mind that each approach has its own trade-offs in terms of readability, maintainability, performance, and memory usage.