Why does “less” consume lots of RAM for piped output on a gzipped file?

Excuse the useless cat and echo up front, but while running less on a ~2GB .gz file I’m seeing ~25GB of RAM consumed, despite the output being piped into awk and consumed there.


I expected the above to complete with essentially no RAM use, but to my surprise the memory usage kept climbing: ~2.5 hours in, it was 89.8% of the way through the .gz and holding ~25GB.


I’ll try some other options (like rewriting my command to use gzip -dc or zcat directly) to see whether those help, but could someone explain WHY this happens with less (or with other commands)? Is it a known less or bash bug fixed in a later version? Or is there a shell trick to force less to behave properly?

P.S. stats.gz is 25261745373 bytes uncompressed, which wraps a 32-bit size counter 5 times.
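The “wrapped around” figure lines up with the gzip format itself: the ISIZE trailer stores the uncompressed length modulo 2^32, so gzip -l underreports anything over 4 GiB. As an aside, the wrap count and the size gzip -l would report can be computed directly:

```shell
# gzip's ISIZE trailer holds the uncompressed size mod 2^32, so a
# 25261745373-byte stream wraps floor(25261745373 / 2^32) = 5 times.
true_size=25261745373
echo "wraps: $(( true_size / (1 << 32) ))"      # prints: wraps: 5
echo "reported: $(( true_size % (1 << 32) ))"   # prints: reported: 3786908893
```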



Answer

less stores all the data it has read in memory; that is what allows you to scroll back up. Even when its output is piped elsewhere, it buffers the entire decompressed input, which is why a ~25GB uncompressed stream ends up as ~25GB of RAM.
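If the goal is only to feed the decompressed lines to awk, a streaming decompressor avoids the problem entirely: zcat (or gzip -dc) writes straight into the pipe and holds only a small fixed buffer, so memory stays flat regardless of file size. A minimal sketch, with a small generated file standing in for the real stats.gz:

```shell
# Stand-in for stats.gz (the real file is ~25 GB uncompressed).
seq 1 1000 | gzip > stats_demo.gz
# zcat streams decompressed bytes straight into awk; memory use stays
# constant no matter how large the file is, unlike less, which buffers.
zcat stats_demo.gz | awk '{ n++ } END { print n }'   # prints: 1000
rm -f stats_demo.gz
```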

User contributions licensed under: CC BY-SA