```js
// Script preparation: an array of 10 random integers in [0, 140)
var array = Array.from({ length: 10 }, () => Math.floor(Math.random() * 140));

// Test case: Set spread
const f = [...new Set(array)];

// Test case: Array from set
const s = new Set(array);
const l = Array.from(s);

// Test case: Filter
const b = array.filter((i, index) => array.indexOf(i) === index);
```
Memory measurements are more precise when the browser is launched with the `--enable-precise-memory-info` flag.
Test name | Executions per second
---|---
Set spread | 957113.9 Ops/sec
Array from set | 555341.2 Ops/sec
Filter | 3155825.0 Ops/sec
The benchmark titled "Set vs Filter for unique 2" compares different methods of extracting unique values from an array in JavaScript. It uses a randomly generated array of numbers created by the script preparation code and tests three distinct approaches: using a `Set`, the `filter` function, and a combination of `Set` and `Array.from`.
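For readers who want to reproduce the comparison locally, a minimal harness along these lines can approximate what the test runner does. This is only a sketch, not the site's actual harness: the iteration count, the `bench` helper, and the use of `performance.now` (global in browsers and modern Node) are assumptions for illustration.

```js
// Mirrors the script preparation code: 10 random numbers in [0, 140)
const data = Array.from({ length: 10 }, () => Math.floor(Math.random() * 140));

// Hypothetical helper: run fn repeatedly and report rough ops/sec.
function bench(name, fn, iterations = 1000000) {
  const start = performance.now();
  for (let i = 0; i < iterations; i++) fn();
  const seconds = (performance.now() - start) / 1000;
  console.log(`${name}: ~${Math.round(iterations / seconds)} ops/sec`);
}

bench('Set spread', () => [...new Set(data)]);
bench('Array from set', () => Array.from(new Set(data)));
bench('Filter', () => data.filter((i, index) => data.indexOf(i) === index));
```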
Set Spread:

```js
const f = [...new Set(array)]
```

This approach creates a `Set` from the original array, which inherently removes duplicates, and then uses the spread operator (`...`) to convert the `Set` back into an array. It is concise and leverages the `Set`, which is specifically designed for storing unique values. The trade-off is the intermediate `Set` object, which may incur additional overhead in terms of memory usage and processing time compared to direct array operations.
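As a quick illustration with a made-up input, a `Set` iterates in insertion order, so the spread keeps each value's first occurrence:

```js
const nums = [3, 1, 3, 2, 1];    // hypothetical sample input
console.log([...new Set(nums)]); // [3, 1, 2]: duplicates dropped, first-occurrence order kept
```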
Array from Set:

```js
const s = new Set(array); const l = Array.from(s)
```

This approach likewise builds a `Set` from the array and then uses `Array.from` to create an array from the `Set`. `Array.from` can sometimes enhance readability, especially for more complex data transformations. It still creates an intermediate `Set`, however, and the two-step conversion (Set to Array) could be slightly less efficient compared to using the spread operator directly.
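The readability point is easiest to see with `Array.from`'s optional mapping callback, which folds a transformation into the conversion step; the doubling below is an invented example:

```js
const s = new Set([1, 2, 2, 3]);
// The second argument is a map function applied during conversion.
const doubled = Array.from(s, (x) => x * 2);
console.log(doubled); // [2, 4, 6]
```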
Filter:

```js
const b = array.filter((i, index) => array.indexOf(i) === index)
```

This approach uses the `filter` method to create a new array containing only unique values: it keeps an element only when its index matches the index of that value's first occurrence. It avoids allocating an intermediate `Set`, but each call to `indexOf` scans the array, leading to a quadratic time complexity of O(n^2). This method is generally less efficient for large datasets.
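To see the quadratic behavior in practice, one can time both strategies on a much larger, mostly distinct input; the array size and value range below are arbitrary assumptions for illustration:

```js
const big = Array.from({ length: 50000 }, () => Math.floor(Math.random() * 1000000));

console.time('filter + indexOf (O(n^2))');
big.filter((v, i) => big.indexOf(v) === i);
console.timeEnd('filter + indexOf (O(n^2))');

console.time('Set spread (O(n))');
[...new Set(big)];
console.timeEnd('Set spread (O(n))');
```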
The execution results indicate that the `filter` method was the fastest, achieving approximately 3,155,825 executions per second, followed by Set spread at about 957,114 executions per second, and then Array from set at about 555,341 executions per second. While `filter` performed best in this specific test case, keep in mind that the input array has only 10 elements, so the quadratic cost of `indexOf` never materializes while the `Set` approaches still pay for allocating an intermediate object. Performance can vary based on multiple factors, including the size of the input array and the environment in which the benchmark runs.
Beyond these approaches, there are other ways to extract unique values from an array:
Using Object Keys: A common pattern uses an object to track unique items:

```js
const unique = array.reduce((acc, item) => {
  acc[item] = true; // object keys are unique, so duplicates collapse
  return acc;
}, {});
const result = Object.keys(unique);
```

One caveat: object keys are always strings, so numeric values come back as strings, unlike with a `Set`.
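A tiny demonstration of that string coercion, with an invented input:

```js
const mixed = [1, 2, 2, 3];
const seen = mixed.reduce((acc, item) => {
  acc[item] = true;
  return acc;
}, {});
console.log(Object.keys(seen));   // ["1", "2", "3"] (keys come back as strings)
console.log([...new Set(mixed)]); // [1, 2, 3] (a Set preserves the numbers)
```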
Sorting and Filtering: Sorting the array first and then filtering out adjacent duplicates:

```js
// Copy before sorting to avoid mutating the input, and use a numeric
// comparator, since the default sort compares elements as strings.
const unique = [...array].sort((a, b) => a - b).filter((value, index, self) => {
  return index === 0 || value !== self[index - 1];
});
```

This returns the values in sorted order rather than in order of first appearance, and the sort makes it O(n log n).
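For example, with a hypothetical input:

```js
const nums = [10, 2, 2, 1, 10];
const unique = [...nums]
  .sort((a, b) => a - b)
  .filter((value, index, self) => index === 0 || value !== self[index - 1]);
console.log(unique); // [1, 2, 10]: sorted ascending, duplicates removed
```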
Each approach has its context and implications, and the best choice may depend on the specific requirements of a project, including data size, performance needs, and code readability.