I have this script.
```shell
log_file_name=dataProcessor
export pwd=`pwd`
echo "in data processing"
log_file_path=LOGS/data-processor/
logfile=$log_file_path$log_file_name.log
log_file_home_path=LOGS
export LOG_FILE_HOME_PATH=$log_file_home_path
export DATA_EXTRACTION_CONFIG_FILE_LOCATION=$pwd/config.properties
export LIBJARS=/home/data-extraction/2.12.2.1/data-extraction/lib/*.jar
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/home/abc/lib/*.jar
echo `$HADOOP_CLASSPATH`
hadoop jar data-processor-*.jar com.impl.JobSubmission -libjars ${LIBJARS} &> $log_file_home_path/process.log
echo "in data processing done"
```
When I execute it from PuTTY it works properly, but it fails when we execute it using Java's ProcessBuilder. It fails with a "Hadoop command not found" error. Below is the Java code I am using to execute the script.
```java
ProcessBuilder builder = new ProcessBuilder(command.trim());
//builder.redirectErrorStream(true);
Process process = builder.start();
exitStatus = process.waitFor();
BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
String line = "";
while ((line = reader.readLine()) != null) {
    output.append(line + "\n");
    System.out.print(line);
}
```
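A common cause of "command not found" in this situation is that the ProcessBuilder child inherits the JVM's environment, which may lack the directories a PuTTY login shell adds to `PATH`. A minimal sketch of how the child's environment can be inspected and adjusted before launch (the script invocation and the `/opt/hadoop/bin` location here are assumptions, not taken from the original setup):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.Map;

public class RunWithPath {
    public static void main(String[] args) throws Exception {
        // Run through a shell so normal PATH-based command lookup applies.
        // "echo PATH=$PATH" is a stand-in for the real data-processing script.
        ProcessBuilder builder = new ProcessBuilder("/bin/bash", "-c", "echo PATH=$PATH");

        // The child inherits the JVM's environment. If the JVM was not started
        // from a login shell, PATH may be missing the Hadoop bin directory, so
        // prepend it explicitly ("/opt/hadoop/bin" is a hypothetical location).
        Map<String, String> env = builder.environment();
        env.put("PATH", "/opt/hadoop/bin:" + env.getOrDefault("PATH", "/usr/bin:/bin"));

        builder.redirectErrorStream(true); // merge stderr into stdout
        Process process = builder.start();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
        System.out.println("exit=" + process.waitFor());
    }
}
```

Note also that reading the output stream only after `waitFor()` returns, as in the snippet above it, can deadlock if the child fills its output buffer; consuming the stream before waiting avoids that.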
Answer
The problem was in another script. Script one (S1) starts the Java application, which internally calls a second script (S2), and S2 was the one failing. S1 was setting an invalid PATH: the variable was written as lowercase `path` instead of `PATH`, which caused the whole problem.
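The case matters because the shell's command lookup consults only the uppercase `PATH` variable; assigning to lowercase `path` just creates an unrelated shell variable. A minimal reproduction of the difference:

```shell
#!/bin/bash
# Assigning to lowercase "path" does NOT affect command lookup:
path="/nonexistent"
command -v ls >/dev/null && echo "ls still found"

# Overwriting the real PATH does break lookup:
saved_path="$PATH"
PATH="/nonexistent"
command -v ls >/dev/null 2>&1 || echo "ls not found"
PATH="$saved_path"
```

So if S1 exports `path=...` intending to extend the search path, the `hadoop` binary is never added to the real `PATH`, and S2 fails with "command not found".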