fs: do bulk file reads to optimize cache extraction (#3539)
* fs: do bulk file reads to optimize cache extraction
This patch speeds up cache extraction by ~2x+ by letting Node
do more of the parallelization work. It moves nearly all of the file-copy
work into the C++ code with minimal boundary crossing (at least
compared to Node streams).
Streams in Node.js are ~3x slower, especially for small files,
than just doing fs.writeFile/readFile, because of this boundary. This
is something Yarn might want to take into account in other places.
The reason this is OK is that pretty much any file this would
handle fits neatly into memory (any npm package MUST fit
into memory by definition, because of the way npm@<5 does extraction).
If you really want to make doubleplus sure to minimize memory usage,
you could do an fs.stat to find the file size and use a heuristic
to only use streams for files bigger than <X>MB.
* Uses readFileBuffer instead of readFile