I have seafile (http://www.seafile.com/en/home/) running on my NAS, and I set up a crontab entry that runs a script every few minutes to check whether the seafile server is up and start it if it isn't.
The script looks like this:
#!/bin/bash
# exit if process is running
if ps aux | grep "[s]eafile" > /dev/null
then
    exit
else
    # restart process
    /home/simon/seafile/seafile-server-latest/seafile.sh start
    /home/simon/seafile/seafile-server-latest/seahub.sh start-fastcgi
fi
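The crontab entry that triggers it is along these lines (the interval and the script path /home/simon/check_seafile.sh here are just placeholders for my actual setup):

# check the seafile server every 5 minutes (script path is a placeholder)
*/5 * * * * /home/simon/check_seafile.sh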
Running /home/simon/seafile/seafile-server-latest/seafile.sh start and /home/simon/seafile/seafile-server-latest/seahub.sh start-fastcgi individually/manually works without a problem, but when I try to run the script manually, neither of those lines executes and seafile/seahub do not start.
Is there an error in my script that is preventing those two lines from executing? I’ve made sure to chmod the script file to 755.
Answer
The problem is likely that when you pipe one command into another, there is no guarantee that the second command won’t start before the first (it can start, but not do anything while it waits for input). For example:
oj@ironhide:~$ ps -ef | grep foo
oj        8227  8207  0 13:54 pts/1    00:00:00 grep foo
There is no process containing the word “foo” running on my machine, but the grep that I’m piping ps to appears in the process list that ps produces.
You could try using pgrep instead, which is pretty much designed for this sort of thing:
if pgrep "[s]eafile"
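Dropped into your script, that check would look something like this (just a sketch; I’ve redirected pgrep’s output to /dev/null since only its exit status matters here):

#!/bin/bash
# exit if a seafile process is already running
if pgrep "[s]eafile" > /dev/null
then
    exit
else
    # otherwise start the server components
    /home/simon/seafile/seafile-server-latest/seafile.sh start
    /home/simon/seafile/seafile-server-latest/seahub.sh start-fastcgi
fi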
Or you could add another pipe to filter out results that include grep:
ps aux | grep "[s]eafile" | grep -v grep
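which, inside your script, would make the test read:

if ps aux | grep "[s]eafile" | grep -v grep > /dev/null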