
Bash increase pid kernel to unlimited for huge loop

I’ve been trying to run cURL in a huge loop, launching each cURL call as a background process from bash. There are about 904 domains to cURL.

The problem is that not all 904 domains can be processed because of the PID limit in the Linux kernel. I tried raising pid_max to 4194303 (I read about it in this discussion: Maximum PID in Linux), but after checking, only 901 domains had run as background processes; before I raised pid_max, only around 704 ran as background processes.
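For reference, the kernel PID limit mentioned above can be inspected (and, as root, raised) like this; a sketch, with the privileged write shown commented out:

```shell
# Read the current kernel PID limit (pid_max is a Linux sysctl).
cat /proc/sys/kernel/pid_max

# Raising it to the 64-bit maximum, as tried in the question (requires root):
# sudo sysctl -w kernel.pid_max=4194303
```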

Here is my loop code:

count=0
while IFS= read -r line || [[ -n "$line" ]]; do
  # Launch each request in the background; this is what exhausts the PID space.
  (curl -s -L -w "\n\nNo:$count\nHEADER CODE:%{http_code}\nWebsite : $line\nExecuted at :$(date)\n==================================================\n\n" -H "X-Gitlab-Event: Push Hook" -H "X-Gitlab-Token: $SECRET_KEY" --insecure "$line" >> output.log) &

  (( count++ ))
done < "$FILE_NAME"
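For comparison, a self-throttling variant of the loop caps the number of concurrent background jobs with wait -n (bash 4.3 or newer) instead of raising pid_max. This is a sketch: echo stands in for the real curl call, and the domain list and log file are generated samples.

```shell
#!/bin/bash
# Sketch: throttle background jobs so only MAX_JOBS run at once.
# "echo" replaces the curl call; inputs are generated samples.
FILE_NAME=$(mktemp)
LOG=$(mktemp)
printf '%s\n' example.com example.org example.net > "$FILE_NAME"

MAX_JOBS=50
count=0
while IFS= read -r line || [[ -n "$line" ]]; do
  # If MAX_JOBS jobs are already running, block until one of them exits.
  while (( $(jobs -rp | wc -l) >= MAX_JOBS )); do
    wait -n
  done
  ( echo "No:$count Website:$line" ) >> "$LOG" &
  (( count++ ))
done < "$FILE_NAME"
wait  # let the remaining background jobs finish
echo "launched $count jobs"
```

The inner while/wait -n pair is what keeps the process count bounded regardless of how many lines the input file has.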

Does anyone have another solution, or a fix, for handling a huge loop that runs cURL as background processes?


Answer

A script example.sh can be created:

#!/bin/bash
# $1 is one domain passed by xargs; $count is the environment variable
# injected by xargs --process-slot-var=count (the worker's slot number).
line=$1
curl -s -L -w "\n\nNo:$count\nHEADER CODE:%{http_code}\nWebsite : $line\nExecuted at :$(date)\n==================================================\n\n" -H "X-Gitlab-Event: Push Hook" -H "X-Gitlab-Token: $SECRET_KEY" --insecure "$line" >> output.log

Then the command could be (limiting the number of processes running at a time to 50; --process-slot-var requires GNU xargs and sets count to the worker's slot index, 0–49, rather than a sequential line counter):

chmod +x example.sh
xargs -n1 -P50 --process-slot-var=count ./example.sh < "$FILE_NAME"
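To see the fan-out without hitting the network, here is a small demo with a throwaway stand-in worker in place of the curl script (an illustration, assuming GNU xargs for --process-slot-var; the file names and domains are made up):

```shell
#!/bin/bash
# Demo of the xargs pattern: a generated worker script replaces example.sh.
worker=$(mktemp)
cat > "$worker" <<'EOF'
#!/bin/bash
# $1 is one domain; $count is injected by xargs --process-slot-var=count.
echo "slot:$count url:$1"
EOF
chmod +x "$worker"

list=$(mktemp)
printf '%s\n' a.example b.example c.example > "$list"

# -n1: one domain per invocation; -P2: at most 2 workers run at a time.
out=$(xargs -n1 -P2 --process-slot-var=count "$worker" < "$list")
echo "$out"
rm -f "$worker" "$list"
```

Each printed line carries a slot index of 0 or 1 (never higher, because -P2 caps concurrency), which is exactly how -P50 would keep the original 904-domain run to 50 processes at a time.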