The main objective here is to create a daemon-spawning function. The daemons need to run arbitrary programs (i.e. use `subprocess`).
What I have so far in my `daemonizer.py` module is:
```python
import os
from multiprocessing import Process
from time import sleep
from subprocess import call, STDOUT


def _daemon_process(path_to_exec, std_out_path, args, shell):
    with open(std_out_path, 'w') as fh:
        args = (str(a) for a in args)
        if shell:
            fh.write("*** LAUNCHING IN SHELL: {0} ***\n\n".format(
                " ".join([path_to_exec] + list(args))))
            retcode = call(" ".join([path_to_exec] + list(args)),
                           stderr=STDOUT, stdout=fh, shell=True)
        else:
            fh.write("*** LAUNCHING WITHOUT SHELL: {0} ***\n\n".format(
                [path_to_exec] + list(args)))
            retcode = call([path_to_exec] + list(args),
                           stderr=STDOUT, stdout=fh, shell=False)
        if retcode:
            fh.write("\n*** DAEMON EXITED WITH CODE {0} ***\n".format(retcode))
        else:
            fh.write("\n*** DAEMON DONE ***\n")


def daemon(path_to_executable, std_out=os.devnull, daemon_args=tuple(), shell=True):
    d = Process(name='daemon', target=_daemon_process,
                args=(path_to_executable, std_out, daemon_args, shell))
    d.daemon = True
    d.start()
    sleep(1)
```
When trying to run this in bash (this will create a file called `test.log` in your current directory):
```
python -c "import daemonizer; daemonizer.daemon('ping', std_out='test.log', daemon_args=('-c', '5', '192.168.1.1'), shell=True)"
```
It correctly spawns a daemon that launches `ping`, but it doesn't respect the arguments passed. This is true if `shell` is set to `False` as well. The log file clearly states that it attempted to launch with the arguments passed.
As a proof of concept, create the following executable:
```
echo "ping -c 5 192.168.1.1" > ping_test
chmod +x ping_test
```
The following works as intended:
```
python -c "import daemonizer; daemonizer.daemon('./ping_test', std_out='test.log', shell=True)"
```
If I test the same `call` code outside of the `multiprocessing.Process` target, it does work as expected.
So how do I fix this mess so that I can spawn processes with arguments?
I'm open to entirely different structures and modules, but they should be part of the standard library and compatible with Python 2.7.x. The requirement is that the `daemon` function should be callable several times asynchronously within a script and produce a daemon each time, and their target processes should be able to end up on different CPUs. Also, the script needs to be able to end without affecting the spawned daemons, of course.
As a bonus, I noticed I needed a `sleep` for the spawning to work at all, or else the script terminates too fast. Is there any way to get around that arbitrary hack, and/or how long do I really need to make it wait to be safe?
Answer
Your arguments are being “used up” by the printing of them!
First, you do this:
```python
args = (str(a) for a in args)
```
That creates a generator, not a list or tuple. So when you later do this:
```python
list(args)
```
That consumes the arguments, and they will not be seen a second time. So when you do this again:

```python
list(args)
```

you get an empty list!
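The exhaustion is easy to see in isolation. This is a standalone illustration, not part of the original code:

```python
# A generator expression can only be iterated once.
args = (str(a) for a in ('-c', 5, '192.168.1.1'))

first = list(args)   # consumes the generator
second = list(args)  # the generator is already exhausted

print(first)   # ['-c', '5', '192.168.1.1']
print(second)  # []
```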
You could fix this by commenting out the lines that log the command, but much better would be to simply create a list in the first place:
```python
args = [str(a) for a in args]
```
Then you can use `args` directly instead of `list(args)`, and it will always have the arguments inside.