I’m looking at some old scripts and I found some parameter assignment that I have not seen before. A while loop reads from a text file and passes the values to a function. The items in the text file look like this:
user_one:abcdef:secretfolder
The first stage of the function then looks like this:
IFS=':' read -a param <<< $@
user="${param[0]}"
pass="${param[1]}"
user_folders="${param[2]}"
I have not seen this sort of assignment before and was wondering if this is just an alternative way of handling it. Is the above the same as this?
IFS=':' read -a param <<< $@
user="${1}"
pass="${2}"
user_folders="${3}"
(The change in values to 1-3 is because ${0} is the name of the script itself.) This script is five years old; the original sort of assignment just seems a longer way to do it, unless I’ve missed something.
I’m still learning shell scripting, but as I understand it, setting IFS=':' splits the fields on : rather than whitespace, so in the examples the value of "${param[0]}" and ${1} passed to the function would be user_one.
Can someone please explain if there is a reason why "${param[0]}" should be used instead of ${1}?
Answer
The command:
IFS=':' read -a param <<< $@
reads the :-separated fields from the arguments passed to the function ($@) into the array variable named param. Bash arrays work much like lists in other languages, and you index them with brackets: ${param[0]} is the first field, ${param[1]} the next, and so on. Arrays like this can contain anything; it’s only because of the $@ in the read command that this param array happens to contain the arguments. It could just as easily contain foo, bar, and baz if it were created like:
param=(foo bar baz)
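With that assignment in place, you can check the indexing directly:

echo "${param[0]}"    # foo  (arrays are zero-indexed)
echo "${param[2]}"    # baz
echo "${#param[@]}"   # 3, the number of elements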
The ${1}, ${2}, etc. syntax instead always refers to the positional parameters: the script’s arguments, or the function’s arguments when used inside a function. Those are never re-split by IFS, so if the whole colon-separated line is passed to the function as a single argument, ${1} is the entire line rather than the first field, which is why the script indexes the param array instead.
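To make the difference concrete, here is a minimal sketch along the lines of the script described (the file name users.txt and the function name process_user are just placeholder assumptions). Because the while loop passes the whole line to the function as a single argument, ${1} inside the function is the entire colon-separated string, while ${param[0]} is the first field:

#!/usr/bin/env bash

process_user() {
    # Split the single colon-separated argument into an array
    IFS=':' read -r -a param <<< "$@"

    echo "\$1          -> $1"            # whole line: user_one:abcdef:secretfolder
    echo "\${param[0]} -> ${param[0]}"   # first field:  user_one
    echo "\${param[1]} -> ${param[1]}"   # second field: abcdef
    echo "\${param[2]} -> ${param[2]}"   # third field:  secretfolder
}

while IFS= read -r line; do
    process_user "$line"
done < users.txt

So the param array is what actually carries the split fields; ${1} would only equal user_one if the caller had already split the line before calling the function.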