I want to submit an SGE job via an SGE file.
For example, I have a run.sge file as follows:
## SGE options
#$ -cwd
#$ -l mem_free=2G
##$ -q all.q@compute-2-3.local
#$ -pe mpich 4
#$ -S /bin/bash

SCRATCH_DIR=/state/partition1/$USER-$JOB_ID
mkdir $SCRATCH_DIR
qsub run_inp.sge
rmdir $SCRATCH_DIR
And a run_inp.sge file as follows:
## SGE options
#$ -cwd
#$ -l mem_free=2G
##$ -q all.q@compute-2-3.local
#$ -pe mpich 4
#$ -S /bin/bash

INPUTFILE="main.inp"
JOB="main"
SCRATCH_DIR=/state/partition1/$USER-$JOB_ID
mkdir $SCRATCH_DIR
/share/apps/simula/Commands/abaqus double inp=$INPUTFILE scratch=$SCRATCH_DIR job=$JOB mp_mode=mpi interactive
rmdir $SCRATCH_DIR
Whenever I submit the job via
qsub run.sge
I get this error:
Unable to run job: denied: host "compute-2-9.local" is no submit host. Exiting.
But if I submit run_inp.sge directly, it works fine:
qsub run_inp.sge
My question is: can I submit SGE jobs from inside an SGE job? If not, is there an alternative way to do this?
Answer
The error occurs because compute nodes (such as compute-2-9.local) are typically not configured as SGE submit hosts, so qsub cannot be run from within a job executing there. The normal approach is to submit all jobs from the login host, where you submit your jobs and check job status; the scheduler then dispatches them to compute nodes in the cluster. If you have multiple jobs with dependencies between them, declare those dependencies to the job scheduler. For example, you can tell the scheduler to wait for job A to finish before starting job B.
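As a sketch of how that looks with SGE from the login host: qsub's -terse option prints just the new job's ID, and -hold_jid makes a job wait for another to finish. The run_cleanup.sge name below is hypothetical, standing in for whatever follow-up script you would run after run_inp.sge.

```shell
#!/bin/bash
# Submit the first job; -terse makes qsub print only the job ID
jid_a=$(qsub -terse run_inp.sge)

# Submit the second job, held until job A completes.
# run_cleanup.sge is a placeholder for your follow-up script.
qsub -hold_jid "$jid_a" run_cleanup.sge
```

This keeps all qsub calls on the submit host while still expressing the "run B after A" ordering that nesting qsub inside a job was trying to achieve.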