I have some shell scripts that work with pipes, like so:
foo.sh | bar.sh
My bar.sh calls a command-line program that can only accept a certain number of lines on stdin, so I want foo.sh's large stdout chunked into blocks of N lines, with bar.sh called once per chunk. Essentially, paginate foo.sh's stdout and run bar.sh multiple times.
Is it possible? I am hoping for some magic in between the pipes like foo.sh | ??? | bar.sh. xargs -n doesn’t quite get me what I want.
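I could write the batching by hand in bash, something like the sketch below (the chunk size of 10000 is just a placeholder), but I am hoping there is a one-liner for the middle of the pipe:
# Hand-rolled batching sketch: collect lines from foo.sh into
# batches of N and pipe each batch to bar.sh on stdin.
N=10000
batch=()
while IFS= read -r line; do
    batch+=("$line")
    if (( ${#batch[@]} == N )); then
        printf '%s\n' "${batch[@]}" | bar.sh
        batch=()
    fi
done < <(foo.sh)
# Flush the final partial batch, if any.
(( ${#batch[@]} > 0 )) && printf '%s\n' "${batch[@]}" | bar.sh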
Answer
I am nowhere near a machine to test this, but GNU Parallel makes this easy, along the lines of:
foo.sh | parallel --pipe -N 10000 -k bar.sh
As an added bonus, it will run as many bar.sh jobs in parallel as you have CPU cores.
Add -j 1 if you only want one bar.sh at a time.
Add --dry-run if you want to see what it would do without actually doing anything.
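As a quick sanity check of the chunking (a small illustrative example, with seq standing in for foo.sh and wc -l standing in for bar.sh):
seq 25 | parallel --pipe -N 10 -k wc -l
That should print 10, 10 and 5 on separate lines: one wc -l result per chunk, kept in input order by -k.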