numpy.memmap: bogus memory allocation

Tag: sparse-file
I have a python3 script that operates on numpy.memmap arrays. It writes an array to a newly generated temporary file located in /tmp (a minimal sketch of the relevant code follows below). The HDD is only 250 GB in size. Nevertheless, the script can somehow generate 10 TB files in /tmp, and the corresponding array still seems to be accessible. The output of the script is the following:

The file
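The script itself was not included in the question, so the following is only a minimal sketch that should reproduce the described behaviour; the use of tempfile, the dtype, and the exact sizes are assumptions, not the asker's actual values:

import numpy as np
import tempfile

# Minimal sketch (assumed, not the asker's actual script): request a
# memmap far larger than the physical disk.
with tempfile.NamedTemporaryFile(dir="/tmp") as f:
    n = 10 * 1024**4 // 8                 # ~10 TiB worth of float64
    arr = np.memmap(f, dtype=np.float64, mode="w+", shape=(n,))
    arr[0] = 1.0                          # touch the first page
    arr[n - 1] = 2.0                      # touch the last page
    arr.flush()
    print(arr[0], arr[n - 1])             # both values read back fine

The sparse-file tag already hints at the mechanism: on common Linux filesystems the file is created sparse, so disk blocks are only allocated for pages that are actually written. Note also that on some systems /tmp is a RAM-backed tmpfs mount rather than part of the HDD.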