
How to count all the files using ls -l statements separated by && in a single line?

I’m trying to count all the files from several ls -l statements for certain file types, separated by the double-ampersand symbol like so:

ls -l *.xml && ls -l *.json && ls -l *.md 

The technique I’ve seen for a single file type (or for all files) counts the lines that begin with a - character, i.e. regular files, using an egrep command: egrep -c '^-'

Here is a link for a single ls -l command to find the count for all files : Link to a question about ls -l count using egrep -c command in stack overflow.

If I chain several ls -l statements for different file types on a single line, how do I count each statement’s totals in Linux using an sh or bash shell script?

I tried this and it doesn’t seem to work:

ls -l *.xml && ls -l *.json && ls -l *.md | egrep -c '^-'

I also tried:

ls -l *.xml && ls -l *.json && ls -l *.md | grep -v /$ | wc -l

Unfortunately, it doesn’t seem to like the ‘&&’ symbols that chain the commands together, and piping into the counter with ‘|’ doesn’t give the combined count either. The pipe symbol must not interact with ‘&&’ the way I expect.


Answer

First, you need the -d option of ls, so that it lists directory entries themselves instead of expanding them and showing all the files inside. Second, without -d you would also have to strip the header line (the one that shows total 45 at the top of each directory listing). Third, using

ls -l *.xml && ls -l *.json && ls -l *.md 

is equivalent to

ls -l *.xml *.json *.md

so you can avoid two calls to ls and two processes.
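Incidentally, the reason the pipeline in the question only counted the *.md files is precedence: a pipe attaches to a single command, not to a whole && chain. Grouping the chain with braces would have made the original approach work (a sketch, though the single-ls form is simpler):

```shell
# Braces group the three ls commands into one compound command, so the
# pipe sees their combined output; egrep -c '^-' then counts the lines
# that describe regular files.
{ ls -l *.xml && ls -l *.json && ls -l *.md; } | egrep -c '^-'
```

Note that && still short-circuits at the first pattern with no matches; separating the commands with ‘;’ instead of ‘&&’ avoids that.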

There’s still an issue with ls: if there are no *.xml files at all, you will get an error (on stderr, that is) like ls: *.xml: No such file or directory. This is because wildcard expansion is done by the shell, and the pattern is passed to ls verbatim when the shell finds no file matching it.
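If you can rely on bash (an assumption; plain POSIX sh has no such option), the nullglob option sidesteps that: an unmatched pattern expands to nothing instead of being passed through verbatim, and counting the expanded list needs no ls at all:

```shell
#!/usr/bin/env bash
shopt -s nullglob            # unmatched patterns vanish instead of staying literal
files=(*.xml *.json *.md)    # the shell expands the globs into an array
echo "${#files[@]}"          # the array length is the file count
```

This also counts correctly when a filename contains a newline, which would inflate any wc -l based count.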

So, finally, a way to do it would be:

ls -ld *.xml *.json *.md | wc -l

(with -d, ls never descends into a matching directory, so the total line and the blank separator lines never appear and piping straight into wc -l is enough).

Note:

I wouldn’t use the -l option at all; it only makes the job harder. With -l, ls takes more time, as it has to stat(2) each file; it prints the annoying total blocks line at the top of each directory listing, which you would have to eliminate; and it inserts a blank line between directory listings (both disappear if you also specify -d). It may also fail on files it cannot stat(2) for info, and you are not using the extra information for anything. You can use just

ls -d *.xml *.json *.md | wc -l

to get only the names that match the patterns you put on the command line. When the output of ls is piped to another command rather than a terminal, ls writes one name per line instead of grouping names into columns, which is exactly what wc -l needs.

Anyway, if I had to use some tool to count files, I would use find(1) instead (it allows far more flexibility in selecting files, and lets you search a directory tree recursively), as in:

find . \( -name "*.xml" -o -name "*.json" -o -name "*.md" \) -print | wc -l

or, if you want it to look only in the current directory, add the option -maxdepth 1 (before the name tests) and find will not search recursively into subdirectories:

find . -maxdepth 1 \( -name "*.xml" -o -name "*.json" -o -name "*.md" \) -print | wc -l
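And since the question actually asked for each statement’s totals, a small loop over the patterns (quoted, so that find rather than the shell expands them) gives a per-type count:

```shell
# One find per pattern; quoting the pattern keeps the shell from
# expanding it, so find does the matching itself.
for pat in '*.xml' '*.json' '*.md'; do
    printf '%s: %d\n' "$pat" "$(find . -maxdepth 1 -name "$pat" | wc -l)"
done
```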
User contributions licensed under: CC BY-SA