<script src="https://cdn.jsdelivr.net/npm/lodash@4.17.5/lodash.min.js"></script>
var firstEqual = Array.from({length: 1000000}, (item, idx) => idx);
var secondEqual = Array.from({length: 1000000}, (item, idx) => idx);
var arrayToDedup = firstEqual.concat(secondEqual);
_.uniq(arrayToDedup);
[...new Set(arrayToDedup)]
Array.from(new Set(arrayToDedup))
For more precise memory measurements, run Chrome with the `--enable-precise-memory-info` flag.
Test case name | Result |
---|---|
Lodash Uniq | |
Javascript Set Iterator | |
Javascript Set Array.from | |
Test name | Executions per second |
---|---|
Lodash Uniq | 7.3 Ops/sec |
Javascript Set Iterator | 8.3 Ops/sec |
Javascript Set Array.from | 8.4 Ops/sec |
Let's break down the provided benchmark and explain what is being tested.
Benchmark Overview
The benchmark compares three approaches to removing duplicates from an array:

- Lodash's `_.uniq()` function.
- The native `Set` data structure, spread back into an array with the iterator/spread syntax (`[...new Set(arrayToDedup)]`).
- The native `Set` data structure, converted back to an array with `Array.from(new Set(arrayToDedup))`.

Options Comparison
The benchmark builds the input array (`arrayToDedup`) by concatenating two identical arrays of 1,000,000 sequential integers, producing 2,000,000 elements in which every value appears exactly twice, and then applies each approach to it.
Here's a brief summary of each approach:
- Lodash Uniq: `_.uniq()` returns a duplicate-free copy of the array, internally tracking the elements it has already seen (lodash delegates to a native `Set` where one is available).
- Javascript Set Iterator: builds a `Set` from the input with `new Set(arrayToDedup)`, which automatically discards duplicates, then spreads it into a new array. Note that JavaScript `Set`s preserve insertion order, so the result keeps the order of first occurrence.
- Javascript Set Array.from: the same idea, but uses `Array.from()` to convert the set back to an array instead of the spread syntax. The two `Set`-based approaches do essentially the same work; any difference comes down to how the engine optimizes `Array.from()` versus spreading an iterator.

Pros and Cons
Here are some pros and cons for each approach:

- Lodash Uniq: a familiar, battle-tested API, but it requires pulling in an external library and is slightly slower here than the native options.
- Javascript Set (either variant): native, dependency-free, preserves insertion order, and is the fastest option in this benchmark; it does require an ES2015-capable environment.
Library and Special JS Features
The benchmark uses the following library:

- Lodash v4.17.5 (loaded from the jsDelivr CDN), which provides the `_.uniq()` function.
Aside from `Set`, `Array.from()`, arrow functions, and the spread syntax (all standard since ES2015), the benchmark relies only on built-in JavaScript data structures and methods.
Alternatives
If you're interested in alternative approaches to removing duplicates from an array, here are a few options:

- `filter()`: keep only the first occurrence of each value: `arrayToDedup.filter((item, idx) => arrayToDedup.indexOf(item) === idx);`. Because `indexOf()` rescans the array for every element, this is O(n²) and will be dramatically slower on inputs of this size.
- `reduce()`: accumulate the values into an object keyed by value, then take the object's values: `Object.values(arrayToDedup.reduce((obj, item) => { obj[item] = item; return obj; }, {}));`. Note that object keys are strings, so this only behaves cleanly for numbers and strings, and integer-like keys come back in ascending numeric order rather than insertion order.
Keep in mind that these alternatives may have different performance characteristics compared to the benchmarked approaches.