
Splitting large scm files into individual commands

I have a very large scm file that is over 3 million lines long. I am ‘cat’ing it through a telnet port to a virtual server. I need to split the code into its individual lines and feed them into the server, sending one line, waiting a few milliseconds, then sending the next. Ex:

File:

(define x (WordNode = "frustration")) \n

(define x (WordNode = "Anger")) \n

Input:
(define x (WordNode = "frustration")) \n

sleep 50 ms

(define x (WordNode = "Anger")) \n

sleep 50 ms


Answer

If you can use GNU sleep (which supports sleeping for a fractional number of seconds), it’s easy:

#!/bin/bash
while IFS= read -r line; do
    echo "$line"
    sleep 0.05
done < file
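A quick sanity check (not part of the original answer) before pointing the loop at 3 million lines: run a few lines through the same read/echo/sleep loop and confirm the lines come through unchanged and the delays add up.

```shell
#!/bin/bash
# Time three lines through the read/echo/sleep loop.
# With a 50 ms sleep per line, three lines should take at least 150 ms.
start=$(date +%s%N)    # nanosecond timestamp (GNU date)
printf 'a\nb\nc\n' | while IFS= read -r line; do
    echo "$line"
    sleep 0.05
done
end=$(date +%s%N)
echo "elapsed: $(( (end - start) / 1000000 )) ms"
```

The `IFS=` and `-r` matter: they keep leading/trailing whitespace and backslashes in each line intact, which you want when the lines are code.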

Or, turn that into a small delay-line utility (more in line with Unix philosophy).

Let’s call it delay.sh (don’t forget to chmod +x delay.sh):

#!/bin/bash
while IFS= read -r line; do
    echo "$line"
    sleep 0.05
done

We read each line from standard input and write it, with a delay, to standard output.

Use it, for example, like this:

head -100 file | ./delay.sh | ...

This will take the first 100 lines from file and feed them, one by one, with a delay, to the next command in the pipeline (perhaps the telnet you mentioned in your question).
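For instance, a complete pipeline into the server could look like the following sketch. The host, port, and the use of nc rather than telnet are assumptions, not from the question; nc is generally better behaved than telnet when used non-interactively in a pipe.

```shell
# Hypothetical host and port -- substitute your virtual server's address.
./delay.sh < file | nc cogserver.example.com 17001
```

Because delay.sh emits one line every ~50 ms, the server receives the commands at that rate regardless of how fast nc could otherwise send them.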

And to “delay” the complete file:

./delay.sh < file

Btw, if your file is 3M lines long (as you claim), bear in mind that delaying each line by 50 ms means the whole run will take ~42 h.
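The arithmetic behind that estimate, as a shell one-liner (bash integer arithmetic, so the result rounds down from ~41.7):

```shell
# 3,000,000 lines x 50 ms each, converted to hours (integer division)
echo "$(( 3000000 * 50 / 1000 / 3600 )) hours"   # → 41 hours
```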

User contributions licensed under: CC BY-SA