
Spark-shell throwing an error when connecting in YARN master mode on HDP 3.0?

Super Collaborator

Exception in thread "main" java.lang.Exception: When running with master 'yarn' either HADOOP_CONF_DIR or YARN_CONF_DIR must be set in the environment.
    at org.apache.spark.deploy.SparkSubmitArguments.validateSubmitArguments(SparkSubmitArguments.scala:288)
    at org.apache.spark.deploy.SparkSubmitArguments.validateArguments(SparkSubmitArguments.scala:248)
    at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:120)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:130)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
[spark@ip-10-0-10-76 ~]$

1 ACCEPTED SOLUTION

Super Collaborator

The above error went away once we added HADOOP_CONF_DIR and YARN_CONF_DIR to the .bashrc file in the user's home directory.
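
As a minimal sketch, assuming the default HDP 3.x layout where the Hadoop and YARN client configuration files both live under /etc/hadoop/conf (adjust the path to match your cluster), the following lines can be appended to the ~/.bashrc of the user that launches spark-shell:

# Point Spark at the cluster configuration directories (assumed HDP default path)
export HADOOP_CONF_DIR=/etc/hadoop/conf
export YARN_CONF_DIR=/etc/hadoop/conf

After saving, run "source ~/.bashrc" (or open a new shell) so the session picks up the variables, then retry with "spark-shell --master yarn".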

