var data = { ...Array.from(Array(100).keys()) };
Object.fromEntries(Object.entries(data).map((key, value) => [key, value]));
Object.entries(data).reduce((acc, [k, v]) => {
acc[k] = v.toString();
return acc;
}, {});
Object.entries(data).reduce((acc, [k, v]) => ({
...acc,
[k]: v.toString()
}), {});
| Test name | Executions per second |
|---|---|
| Object.fromEntries | 52808.7 Ops/sec |
| Reduce (reuse object) | 431260.7 Ops/sec |
| Reduce (creating temporary objects) | 175417.5 Ops/sec |
The benchmark tests three different approaches to transforming an object whose keys are the integers 0 to 99 into a new object while converting the values to strings. The source object is created by spreading an array of 100 sequential numbers into an object literal.
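As a quick aside on the preparation code: spreading an array into an object literal uses the array indices as property keys, so `data` holds 100 entries with string keys and numeric values. A minimal sketch of the resulting shape:

```js
// Spreading an array into an object literal turns array indices into property keys.
const data = { ...Array.from(Array(100).keys()) };

console.log(data[0], data[99]);        // 0 99  (values are still numbers)
console.log(Object.keys(data).length); // 100   (keys "0" through "99")
```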
Object.fromEntries with map:
Object.fromEntries(Object.entries(data).map((key, value) => [key, value]));
This approach uses `Object.entries` to convert the object into an array of key-value pairs, applies a `map` callback to transform those pairs (although the implementation accidentally takes `key` and `value` as separate parameters, which `map` fills with the entry and its index, when it should destructure the entry as `[key, value]`), and then converts the result back into an object with `Object.fromEntries`. It also carries the overhead of `map`, which builds a full intermediate array of pairs that must then be converted back into an object. A corrected version of this test case is sketched below.
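For reference, a corrected version of this test case would destructure each entry and apply the string conversion that the other test cases perform; this is a sketch of the presumably intended code, not what the benchmark actually ran:

```js
// Destructure each [key, value] entry and convert the value to a string.
Object.fromEntries(
  Object.entries(data).map(([key, value]) => [key, value.toString()])
);
```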
Reduce (reuse object):
Object.entries(data).reduce((acc, [k, v]) => { acc[k] = v.toString(); return acc; }, {});
This approach uses `Object.entries` to get the key-value pairs and processes them with `reduce`, building the new object by adding properties to the accumulator object `acc`. Because `acc` is mutated and reused directly, it avoids the overhead of creating intermediate data structures. The trade-off is that it is more imperative than `map`, which might reduce readability for some developers.

Reduce (creating temporary objects):
Object.entries(data).reduce((acc, [k, v]) => ({ ...acc, [k]: v.toString() }), {});
This approach also uses `reduce`, but creates a new copy of the accumulator object on every iteration using spread syntax (`...acc`). Each spread re-copies every property accumulated so far, so the copying work grows with the number of entries already processed.
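A minimal sketch of why this is costly: every spread re-copies all properties accumulated so far, so transforming n entries performs roughly n²/2 property copies instead of n.

```js
// Count how many properties { ...acc } re-copies across the whole reduce.
// Iteration 1 copies 0 properties, iteration 2 copies 1, ..., iteration 100 copies 99.
let copies = 0;
Object.entries(data).reduce((acc, [k, v]) => {
  copies += Object.keys(acc).length; // work done by { ...acc } this iteration
  return { ...acc, [k]: v.toString() };
}, {});
console.log(copies); // 4950 for 100 entries (0 + 1 + ... + 99)
```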
From the benchmark results, measured in a specific browser environment, `Object.fromEntries` is the slowest approach at roughly 52,800 ops/sec, the spread-based `reduce` reaches about 175,400 ops/sec, and the clear winner is `reduce` with a reused object at about 431,300 ops/sec. Clearly, the overhead of creating new objects impacts performance dramatically, illustrating the importance of considering memory and performance when processing large datasets.

All three test cases rely on `Object.entries`, `Object.fromEntries`, and spread syntax to construct objects. These methods are part of the core JavaScript language and improve both the readability and maintainability of the code, although, as the results show, their use can have varying performance implications.

Alternative approaches include `for...in` loops, `forEach`, or manual loops written with `for`, which offer different balances between clarity, expressiveness, and performance; two of these are sketched below. Additionally, libraries such as Lodash provide utility functions for manipulating objects and may further enhance clarity and reusability at the cost of additional overhead.
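For illustration, a sketch of two such alternatives, assuming the same `data` object and, for the second variant, that Lodash is loaded as `_`; both produce the same string-valued result as the reduce-with-reuse test case:

```js
// Plain for...of loop over the entries: explicit, imperative, no intermediate copies.
const result = {};
for (const [key, value] of Object.entries(data)) {
  result[key] = value.toString();
}

// Lodash equivalent: maps each value of the object to its string form.
// const result = _.mapValues(data, (value) => value.toString());
```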