I was running a script that sequentially reads a large number of gzipped files and prints the result of a computation for each file to stdout.
After about 16,000 files, the process just stopped and "Killed" was printed on the terminal.
I suspect the memory used by the process keeps increasing until the kernel's OOM killer decides to kill it.
I could reduce the code to this test case:
'use strict';
var zlib = require('zlib');

// Compress a small string once, then decompress it repeatedly.
var data = 'abcdefghijklmnopqrstuvwxyz';
var gzipped = zlib.gzipSync(data);

while (true) {
  var contents = zlib.gunzipSync(gzipped);
  process.stdout.write(contents.toString() + '\n');
}
> node test.js > /dev/null
FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - process out of memory
[1] 15363 abort (core dumped) node test.js > /dev/null
This is what I see with the top command just before the process disappears:
> top
15363 mzasso 20 0 10,240g 3,716g 11776 R 111,0 24,2 0:53.59 node
NB: similar code that just writes data in the while loop, without decompressing, keeps memory consumption stable (see the sketch below).
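For reference, this is a minimal sketch of the variant I mean, assuming the same data string and loop structure as the test case above; the only difference is that the gunzipSync call is removed:

'use strict';
var data = 'abcdefghijklmnopqrstuvwxyz';

while (true) {
  // No zlib call here: in this variant, memory usage stays stable.
  process.stdout.write(data + '\n');
}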