I have a folder that contains 10 folders, which contain a huge mess of files. I’d like to see not which specific files are using the most space, but which kinds of files (.png, .jpg, .txt, etc.) are using the most disk space. I saw a previous post that listed the disk usage of a file type by giving the extension,
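A rough way to group disk usage by extension is to let find print each file's size and path and sum by the last dot-suffix. This is only a sketch: it assumes GNU find and awk, and filenames without embedded newlines; files with no extension are lumped under "no extension".

    find . -type f -printf '%s %p\n' \
      | awk '{
          size = $1
          sub(/^[0-9]+ /, "")                 # the rest of the line is the path
          ext = "no extension"
          n = split($0, parts, ".")
          if (n > 1 && parts[n] !~ /\//) ext = "." parts[n]
          total[ext] += size
        }
        END { for (e in total) printf "%15d  %s\n", total[e], e }' \
      | sort -rn

The totals are in bytes; if GNU coreutils is available, piping the result through numfmt --to=iec should make them human-readable.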
Tag: du
AWS instance not showing the correct free space
I use AWS and I have an EC2 instance. When I run the command du -h --max-depth=1 I get: and when I run the command df -h I get: It looks like I am using only 1GB, but df shows that I am using 7.1GB. So I ran the command lsof +L1 to locate deleted files that processes are still using
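A minimal sketch of that comparison, assuming the root filesystem is the one being checked:

    # What the filesystem reports vs. what du can reach from /
    df -h /
    sudo du -xsh /        # -x keeps du on this one filesystem

    # Deleted files still held open by a process (link count < 1);
    # their space is not released until the process closes them or exits.
    sudo lsof +L1

A gap between the two numbers is often explained by such deleted-but-open files, or by files hidden underneath a mount point.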
Find the usage % of a directory on the disk [closed]
Closed. This question does not meet Stack Overflow guidelines and is not currently accepting answers. It does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
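For what the title asks, one common sketch compares du for the directory against df for the filesystem that holds it; the path below is a placeholder and GNU coreutils is assumed:

    # Directory size in bytes (actual disk usage, not apparent size)
    dir_bytes=$(du -s -B1 /path/to/dir | awk '{ print $1 }')

    # Total size of the filesystem that holds the directory, in 1K blocks
    fs_kbytes=$(df -P /path/to/dir | awk 'NR == 2 { print $2 }')

    # The directory's share of the whole filesystem, as a percentage
    awk -v d="$dir_bytes" -v f="$fs_kbytes" \
        'BEGIN { printf "%.1f%%\n", d / (f * 1024) * 100 }'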
How to get the size of a folder, including the apparent size of sparse files? (du is too slow)
I have a folder containing a lot of KVM qcow2 files; they are all sparse files. Now I need to get the total size of the folder, and the qcow2 file sizes should be counted as apparent size (not real size). For example:
    image: c9f38caf104b4d338cc1bbdd640dca89.qcow2
    file format: qcow2
    virtual size: 100G (107374182400 bytes)
    disk size: 3.3M
    cluster_size: 65536
the image should be treated
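A sketch of summing apparent sizes, assuming GNU coreutils/findutils and using /var/lib/libvirt/images as a placeholder path:

    # Apparent (sparse-aware) size of the whole tree
    du -sh --apparent-size /var/lib/libvirt/images

    # Or sum only the qcow2 files' apparent sizes from stat metadata
    find /var/lib/libvirt/images -type f -name '*.qcow2' -printf '%s\n' \
        | awk '{ total += $1 } END { printf "%.1f GiB\n", total / 1024 / 1024 / 1024 }'

If "apparent size" here really means the qcow2 virtual size rather than the file length on disk, the per-image numbers would have to come from qemu-img info instead.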