Spark-shell throwing an error when connecting in YARN master mode on HDP 3.0?
Labels:
- Apache Hadoop
- Apache Spark
- Apache YARN
Created 09-26-2018 05:56 AM
```
Exception in thread "main" java.lang.Exception: When running with master 'yarn' either HADOOP_CONF_DIR or YARN_CONF_DIR must be set in the environment.
        at org.apache.spark.deploy.SparkSubmitArguments.validateSubmitArguments(SparkSubmitArguments.scala:288)
        at org.apache.spark.deploy.SparkSubmitArguments.validateArguments(SparkSubmitArguments.scala:248)
        at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:120)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:130)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
[spark@ip-10-0-10-76 ~]$
```
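For context, the stack trace shows this check fires inside spark-submit's argument validation, before any connection to the cluster is attempted. The exact command isn't shown in the post, but any YARN-mode launch like the following would hit it when neither variable is set:

```bash
# Hypothetical reproduction: launching spark-shell against YARN with
# neither HADOOP_CONF_DIR nor YARN_CONF_DIR exported fails validation.
spark-shell --master yarn
```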
Created 09-26-2018 07:04 AM
The above error went away once we added HADOOP_CONF_DIR and YARN_CONF_DIR to the .bashrc file in the user's home directory.
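A minimal sketch of that fix, assuming the Hadoop and YARN client configurations live under /etc/hadoop/conf (the usual location on an HDP cluster; adjust the path if your configs are elsewhere):

```bash
# Append to ~/.bashrc of the user that launches spark-shell.
# /etc/hadoop/conf is the typical HDP client-config directory (an assumption;
# check where the Hadoop client configs are deployed on your cluster).
export HADOOP_CONF_DIR=/etc/hadoop/conf
export YARN_CONF_DIR=/etc/hadoop/conf
```

Then reload the environment and retry:

```bash
source ~/.bashrc
spark-shell --master yarn
```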
