
Multithreaded Programming with Bash Script

I am running a bash script like this:

for i in {0..3250000..50000}
  do
    wget "http://xxx/select?q=*:*&row_size=50000&start=$i" -O $i.csv
  done

Every time I send a request, I have to wait for it to finish and write its output to a file before the loop continues. I want to do this asynchronously instead: send a request and keep looping without waiting for the response, and handle each response as it arrives.

How can I do that?


Answer

You can use xargs:

printf '%s\0' {0..3250000..50000} |
    xargs -0 -I {} -P 20 \
    wget 'http://xxx/select?q=*:*&row_size=50000&start={}' -O {}.csv

The -0 tells xargs to read NUL-delimited input (matching the \0 emitted by printf), -I {} substitutes each argument for {} in the command (and implies one argument per invocation), and -P 20 runs up to 20 wget processes in parallel.

Alternatively, you can append & to the wget command inside your loop to run each request in the background, then wait for all of the background processes to finish.
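
For example, a minimal sketch of that background-job approach, reusing the loop from the question (the placeholder http://xxx URL is kept as-is):

for i in {0..3250000..50000}
  do
    # launch each wget in the background so the loop does not block
    wget "http://xxx/select?q=*:*&row_size=50000&start=$i" -O "$i.csv" &
  done
# block until every background wget has finished
wait

Note that, unlike xargs -P 20, this starts every request at once with no limit on how many run concurrently.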
