HADOOP_HOME is not set when trying to run a Spark application in YARN cluster mode

I am trying to run an application in yarn-cluster mode.

Here are the settings in the shell script:
spark-submit --class "com.Myclass" \
  --num-executors 2 \
  --executor-cores 2 \
  --master yarn \
  --supervise \
  --deploy-mode cluster \
  ../target/

My application works fine in yarn-client and local modes.

EXCERPT FROM SPARK-SUBMIT UNDER YARN CLUSTER MODE

&&&&&&&&&&&&&&&&&&&&&& HADOOP HOME 
/usr/lib/hadoop
&&&&&&&&&&&&&&&&&&&&&& HADOOP_CONF_DIR 
/usr/lib/hadoop/etc/hadoop

...
Diagnostics: Exception from container-launch.
Container id: container_1454984479786_0006_02_000001
Exit code: 15
Stack trace: ExitCodeException exitCode=15: 
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:543)
    at org.apache.hadoop.util.Shell.run(Shell.java:460)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:720)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:210)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)

Further, I am getting the following error:
ERROR DETAILS FROM YARN LOGS APPLICATIONID
INFO : org.apache.spark.deploy.yarn.ApplicationMaster - Registered signal handlers for [TERM, HUP, INT]
DEBUG: org.apache.hadoop.util.Shell - Failed to detect a valid hadoop home directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
    at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:307)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:332)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
    at org.apache.hadoop.yarn.conf.YarnConfiguration.<clinit>(YarnConfiguration.java:590)
    at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil.newConfiguration(YarnSparkHadoopUtil.scala:62)
    at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:52)
    at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil.<init>(YarnSparkHadoopUtil.scala:47)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at java.lang.Class.newInstance(Class.java:374)
    at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:386)
    at org.apache.spark.deploy.SparkHadoopUtil$.yarn$lzycompute(SparkHadoopUtil.scala:384)
    at org.apache.spark.deploy.SparkHadoopUtil$.yarn(SparkHadoopUtil.scala:384)
    at org.apache.spark.deploy.SparkHadoopUtil$.get(SparkHadoopUtil.scala:401)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:623)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
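For reference, the ApplicationMaster log above can be pulled with the standard YARN CLI; the application id is recoverable from the container id in the diagnostics (container_1454984479786_0006_02_000001 belongs to application_1454984479786_0006):

```shell
# Fetch aggregated container logs for the failed application.
# Requires yarn.log-aggregation-enable=true on the cluster.
yarn logs -applicationId application_1454984479786_0006
```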

I tried modifying spark-env.sh as follows, and I can see HADOOP_HOME logged, but I still get the error.

Entries added to spark-env.sh:
export HADOOP_HOME="/usr/lib/hadoop"
echo "&&&&&&&&&&&&&&&&&&&&&& HADOOP HOME " 
echo "$HADOOP_HOME"
export HADOOP_CONF_DIR="$HADOOP_HOME/etc/hadoop"
echo "&&&&&&&&&&&&&&&&&&&&&& HADOOP_CONF_DIR " 
echo "$HADOOP_CONF_DIR"
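One point worth noting: in yarn-cluster mode the ApplicationMaster runs in a container on an arbitrary NodeManager, so variables exported in spark-env.sh on the submitting host may never reach it. Spark's documented spark.yarn.appMasterEnv.* properties forward environment variables into the AM container instead. A sketch of how that might look, reusing the paths from above (untested against this cluster):

```shell
# Forward HADOOP_HOME / HADOOP_CONF_DIR to the ApplicationMaster container
# via spark.yarn.appMasterEnv.* (see Spark's "Running on YARN" docs).
spark-submit --class "com.Myclass" \
  --num-executors 2 \
  --executor-cores 2 \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.appMasterEnv.HADOOP_HOME=/usr/lib/hadoop \
  --conf spark.yarn.appMasterEnv.HADOOP_CONF_DIR=/usr/lib/hadoop/etc/hadoop \
  ../target/
```

(`--supervise` is dropped here since it applies to Spark standalone mode rather than YARN.)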