java.io.IOException: Cannot run program "java" (in directory ""): error=2, No such file or directory

New Contributor

Hi,

We are starting a Java process from each executor of our streaming job. To start that Java process we are using the following code:

SparkProcess process = new SparkProcess();
process.setExecDir(sparktemp);
process.setCommand("java");
process.addParameters("-Xmx" + xmx);
process.addParameters("-cp");
process.addParameters(sparktemp + "<jar_name>");
process.addParameters("<full class name>");
We are getting the following error:

Cannot run program "java" (in directory "<complete_path_of_jar>"): error=2, No such file or directory
    at com.subex.roc.dataload.kafka2hive.StorageFunction$1.call(StorageFunction.java:631)
    at com.subex.roc.dataload.kafka2hive.StorageFunction$1.call(StorageFunction.java:554)
    at org.apache.spark.api.java.JavaRDDLike$anonfun$foreachPartition$1.apply(JavaRDDLike.scala:219)
    at org.apache.spark.api.java.JavaRDDLike$anonfun$foreachPartition$1.apply(JavaRDDLike.scala:219)
    at org.apache.spark.rdd.RDD$anonfun$foreachPartition$1$anonfun$apply$29.apply(RDD.scala:926)
    at org.apache.spark.rdd.RDD$anonfun$foreachPartition$1$anonfun$apply$29.apply(RDD.scala:926)
    at org.apache.spark.SparkContext$anonfun$runJob$5.apply(SparkContext.scala:2069)
    at org.apache.spark.SparkContext$anonfun$runJob$5.apply(SparkContext.scala:2069)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
    at org.apache.spark.scheduler.Task.run(Task.scala:108)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Cannot run program "java" (in directory "<complete_path_of_jar>"): error=2, No such file or directory
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
    at com.subex.sparkcommon.process.SparkProcess.execute(SparkProcess.java:50)
    at com.subex.roc.dataload.kafka2hive.StorageFunction$1.call(StorageFunction.java:615)
    ... 13 more
Caused by: java.io.IOException: error=2, No such file or directory
    at java.lang.UNIXProcess.forkAndExec(Native Method)
    at java.lang.UNIXProcess.<init>(UNIXProcess.java:247)
    at java.lang.ProcessImpl.start(ProcessImpl.java:134)
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
    ... 15 more
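For context, SparkProcess is our own thin wrapper around java.lang.ProcessBuilder (the execute() frame in the trace above). A simplified sketch of how it presumably maps onto ProcessBuilder, with names and structure simplified rather than the actual source, would be:

import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

// Simplified sketch of what SparkProcess.execute() presumably does:
// set a working directory, build the command line, and call ProcessBuilder.start().
public class SparkProcessSketch {
    private File execDir;
    private final List<String> command = new ArrayList<>();

    public void setExecDir(String dir) {
        this.execDir = new File(dir);
    }

    public void setCommand(String cmd) {
        // assumed to be called before addParameters(), as in the snippet above
        command.add(cmd);
    }

    public void addParameters(String param) {
        command.add(param);
    }

    public Process execute() throws IOException {
        ProcessBuilder builder = new ProcessBuilder(command);
        builder.directory(execDir);          // working directory for the child process
        builder.redirectErrorStream(true);   // merge stderr into stdout
        // error=2 (ENOENT) from start() typically means either the command binary
        // or the working directory could not be found on that executor host.
        return builder.start();
    }
}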

We granted full permissions on the jar path and on the Java path, but we still get this error. JAVA_HOME is set in .bashrc, and $JAVA_HOME/bin is also added to the PATH.
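If it helps, a hypothetical check like the one below could be dropped into the existing foreachPartition call to see what the executor JVM itself resolves at runtime (sparktemp is the same directory used above); the .bashrc settings are not necessarily what the YARN container sees:

// Hypothetical diagnostic inside the existing call(...), printed to the executor logs.
System.out.println("java.home = " + System.getProperty("java.home"));  // JRE the executor runs on
System.out.println("PATH = " + System.getenv("PATH"));                 // PATH seen by the executor JVM
System.out.println("exec dir exists: " + new java.io.File(sparktemp).isDirectory());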

We are using HDP version 2.6.3.

Please suggest what I am missing here. Also, please let me know if you need more information.

Thanks

Chandan

1 REPLY

Rising Star

Using the same user that runs this application, what is the output of running "which java" on each and every one of the NodeManager nodes?

ProcessBuilder doesn't resolve the command from the locations in your environment variables; it looks for "java" at "/usr/bin/java". Is that the java binary you have been granting permissions on?
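If the binary that "which java" points to isn't the one you expect, one way to take the PATH lookup out of the equation entirely (just a sketch with hypothetical names, not the only fix) is to build the absolute path of the launcher from the executor's own JRE via the java.home system property:

import java.io.File;
import java.io.IOException;

public class LaunchWithAbsoluteJava {
    public static Process launch(String execDir, String xmx, String classpath, String mainClass)
            throws IOException {
        // Reuse the JVM the executor is already running on, so no PATH lookup
        // is involved when the child process is forked.
        String javaBin = System.getProperty("java.home")
                + File.separator + "bin" + File.separator + "java";

        ProcessBuilder builder = new ProcessBuilder(
                javaBin, "-Xmx" + xmx, "-cp", classpath, mainClass);
        builder.directory(new File(execDir));  // this directory must exist on the executor host
        builder.redirectErrorStream(true);
        return builder.start();
    }
}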