
Failure of bash script called via command substitution does not stop parent script

I have a bash script (call it outer.sh):


set -e

for row in $(./inner.sh); do
  echo $?
  echo outer=$row
done

echo "continuing"

which invokes another bash script (call it inner.sh):


echo "A"
echo "B"

exit 1

I want the first script to fail fast when the second script exits with an error. (The second script actually reads rows from a database to stdout; in case of a database connectivity error it returns a nonzero exit code.)

I expected the set -e option to cause premature termination of the first script. However, from the script output it appears that even though the exit code from the second script is passed to the first script, the loop is performed and execution continues beyond the loop.


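The behaviour can be reproduced in a self-contained sketch (the real scripts aren't shown in full here, so the inner script is replaced by a stand-in function; the function name is illustrative):

```shell
#!/bin/bash
# Stand-in for the inner script: prints some rows, then fails.
inner() { echo A; echo B; exit 1; }

demo() {
  set -e
  # inner's exit 1 does NOT trigger errexit: a failing command
  # substitution in a for-loop word list is not treated as a
  # failing simple command.
  for row in $(inner); do
    echo "row=$row"
  done
  echo "continuing"   # still reached
}

out=$(demo)
status=$?
printf '%s\n' "$out"
```

Despite set -e and the stand-in's exit 1, both rows are processed and "continuing" is printed, and the outer script itself exits 0.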
I want neither the loop nor any command after the loop to be executed. I understand that the second script had passed some data to the first script before it exited with an error, so the loop processed that data. I don’t understand why the loop didn’t stop then, or what the correct fix is. The best thing I could figure out is to store the result of the command substitution into an array, which works:

a=$(./inner.sh)
# execution won't get here when error
echo $?
for row in $a; do
  echo outer=$row
done

Is there a way to do this without anything being executed? I played with shopt -s inherit_errexit, as I found here, but with no success.
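For reference, inherit_errexit (bash 4.4+) only makes command-substitution subshells inherit set -e: it truncates the substitution's output at the failure point, but it still does not stop the parent when the substitution sits in a for-loop word list, which is presumably why it didn't help here. A sketch of that behaviour:

```shell
#!/bin/bash
set -e
shopt -s inherit_errexit   # bash 4.4+: $(...) subshells inherit errexit

out=$(
  # The inner $(...) now stops at `false`, so only A is produced,
  # but the loop still runs and the script still continues.
  for row in $(echo A; false; echo B); do
    echo "row=$row"
  done
  echo "continuing"
)
printf '%s\n' "$out"
```

So inherit_errexit changes what the substitution produces, not whether the parent keeps going.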



Your idea for a solution is good, but a=$(./inner.sh) doesn’t populate an array, it populates a string, and then for row in $a leaves the contents of that string unquoted and so open to the shell for interpretation (word splitting and filename expansion). You can do this to make/use a as an array, if the output of inner.sh is as simple as you show:

a=( $(./inner.sh) )
(( $? == 0 )) || exit 1
echo "$?"
for row in "${a[@]}"; do
  echo outer=$row
done
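The check works because, as with a plain variable, the exit status of the array assignment is the status of the last command substitution performed in it. A sketch with a stand-in function (the exit code 9 is arbitrary, just to make the propagated status visible):

```shell
#!/bin/bash
# Stand-in for the failing inner script.
inner() { echo A; echo B; exit 1; }

status=0
out=$(
  a=( $(inner) )
  (( $? == 0 )) || exit 9   # stop before the loop when inner failed
  for row in "${a[@]}"; do echo "row=$row"; done
) || status=$?
echo "status=$status"
```

The loop is never entered and the chosen exit status is what the caller sees.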

but rather than a=( $(./inner.sh) ), which has some caveats (the unquoted expansion is still subject to word splitting and globbing), it’d be more robust to do:

IFS=$'\n' read -r -d '' -a a < <( ./inner.sh && printf '\0' )

or:

readarray -t a < <( ./inner.sh )
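One caveat worth noting: readarray itself succeeds even when the producer fails (the producer's status is lost behind the process substitution), so it still has to be checked separately if fail-fast matters; the read form above returns nonzero on failure because the NUL delimiter from printf '\0' never arrives. On the other hand, splitting on newlines only means rows containing spaces or glob characters survive intact. A sketch with a stand-in producer:

```shell
#!/bin/bash
# Stand-in producer whose rows contain spaces and glob characters.
rows() { printf '%s\n' 'row one' 'row *two*'; }

readarray -t a < <( rows )

# Quoted expansion: each row stays one word, no globbing.
out=$(for row in "${a[@]}"; do echo "got: $row"; done)
printf '%s\n' "$out"
echo "count=${#a[@]}"
```

With the unquoted for row in $a form, 'row one' would have been split into two words and *two* would have been glob-expanded.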

See Reading output of a command into an array in Bash and How to split a string into an array in Bash?
