Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant

Spark-shell throws an error when connecting in YARN master mode on HDP 3.0

Super Collaborator

Exception in thread "main" java.lang.Exception: When running with master 'yarn' either HADOOP_CONF_DIR or YARN_CONF_DIR must be set in the environment.
        at org.apache.spark.deploy.SparkSubmitArguments.validateSubmitArguments(SparkSubmitArguments.scala:288)
        at org.apache.spark.deploy.SparkSubmitArguments.validateArguments(SparkSubmitArguments.scala:248)
        at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:120)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:130)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

1 ACCEPTED SOLUTION

Super Collaborator

The error went away once we added HADOOP_CONF_DIR and YARN_CONF_DIR to the .bashrc file in the user's home directory.
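A minimal sketch of that fix: export both variables in ~/.bashrc of the user that runs spark-shell. The path /etc/hadoop/conf is the typical HDP 3.x client-configuration directory and is an assumption here; adjust it to your cluster's layout.

```shell
# Append to ~/.bashrc of the user that launches spark-shell.
# /etc/hadoop/conf is the usual HDP 3.x client config dir (assumption; verify on your cluster).
export HADOOP_CONF_DIR=/etc/hadoop/conf
export YARN_CONF_DIR=/etc/hadoop/conf

# Reload the file in the current session, then retry:
#   source ~/.bashrc
#   spark-shell --master yarn
```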

