
Getting org.apache.hadoop.conf.Configuration cannot be cast to org.apache.hadoop.yarn.conf.YarnConfiguration


New Contributor

I used the docker image cloudera/quickstart:latest for the Cloudera VM setup, and I have installed Spark 2.4.7 (pre-built for Hadoop 2.6) and Python 3.6.

 

The VM has this pre-installed Hadoop version:

Hadoop 2.6.0-cdh5.7.0
Subversion http://github.com/cloudera/hadoop -r c00978c67b0d3fe9f3b896b5030741bd40bf541a
Compiled by jenkins on 2016-03-23T18:36Z
Compiled with protoc 2.5.0
From source with checksum b2eabfa328e763c88cb14168f9b372
This command was run using /usr/jars/hadoop-common-2.6.0-cdh5.7.0.jar

 

While running the pyspark command, I get this error:

1/02/06 15:51:55 INFO yarn.ApplicationMaster: Registered signal handlers for [TERM, HUP, INT]
Exception in thread "main" java.lang.ClassCastException: org.apache.hadoop.conf.Configuration cannot be cast to org.apache.hadoop.yarn.conf.YarnConfiguration
at org.apache.spark.deploy.yarn.ApplicationMaster.<init>(ApplicationMaster.scala:61)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:652)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:69)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:68)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:68)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:651)
at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:674)
at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
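For context, the exception itself is ordinary Java downcast behavior: an object constructed as the base class can never be cast to a subclass. In Spark's ApplicationMaster, mismatched Spark/Hadoop jars can hand the YARN code a plain Configuration where a YarnConfiguration is expected, and the downcast throws. A minimal sketch with stand-in classes (plain Java, no Hadoop dependencies; the class names here merely mirror the Hadoop hierarchy):

```java
// Stand-ins mirroring the Hadoop hierarchy:
// Configuration is the base class, YarnConfiguration extends it.
class Configuration {}
class YarnConfiguration extends Configuration {}

public class CastDemo {
    // Returns true when the downcast throws, mirroring the Spark failure.
    static boolean downcastFails() {
        Configuration plain = new Configuration();
        try {
            YarnConfiguration yc = (YarnConfiguration) plain; // throws here
            return false;
        } catch (ClassCastException e) {
            return true;
        }
    }

    public static void main(String[] args) {
        // The other direction (upcast) is always safe:
        Configuration ok = new YarnConfiguration();
        System.out.println("downcast fails: " + downcastFails());
    }
}
```

This is why version mismatches surface as a ClassCastException rather than a clearer error: the wrong jar on the classpath builds the wrong concrete type long before the cast.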

 

I'm quite stuck on this exception; any help would be appreciated.

 

Thanks in advance.

1 REPLY

Re: Getting org.apache.hadoop.conf.Configuration cannot be cast to org.apache.hadoop.yarn.conf.YarnConfiguration

@Amrendra This looks like a compatibility issue to me. Can you check the compatibility matrix docs below?

 

https://docs.cloudera.com/documentation/enterprise/release-notes/topics/rn_consolidated_pcm.html#pcm...

 

https://docs.cloudera.com/documentation/spark2/latest/topics/spark2.html
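One common workaround for this class of mismatch (an assumption on my part, not something the docs above mandate) is to use a "Hadoop free" Spark build and point it at the cluster's own Hadoop jars via SPARK_DIST_CLASSPATH, so Spark and YARN load the same classes:

```shell
# Sketch for conf/spark-env.sh on the quickstart VM (paths are assumptions).
# Let Spark pick up the CDH cluster's Hadoop jars instead of bundled ones:
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
```

With a Spark distribution that bundles its own (stock Apache) Hadoop 2.6 jars, the CDH-patched YARN classes and the bundled ones can clash even though the major.minor versions match.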


Cheers!