SparkContext giving error in initialization : org.apache.hadoop.ipc.StandbyException

Contributor

While running a normal Spark job, the SparkContext fails to initialize with the error below:

 [Thread-6] ERROR org.apache.spark.SparkContext - Error initializing SparkContext.
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby.

 

I believe it is pointing to the standby NameNode.

This error started appearing after we enabled NameNode HA on our Cloudera cluster.

I have checked hive-site.xml, core-site.xml, and spark-defaults.xml.

They all have the correct configs pointing to the nameservice ID.
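
One way to confirm what the client config actually resolves to on a gateway host is hdfs getconf (a rough sketch; "nameservice1" is only an assumed name, substitute your own nameservice ID):

# expect the nameservice URI, e.g. hdfs://nameservice1
hdfs getconf -confKey fs.defaultFS
hdfs getconf -confKey dfs.nameservices
# should list both HA NameNode hosts
hdfs getconf -namenodes
# expect org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider
hdfs getconf -confKey dfs.client.failover.proxy.provider.nameservice1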

1 ACCEPTED SOLUTION

Super Guru
Hmm, it looks like Spark is not able to reach the NameNode. Are both of your NameNodes up and running? Can you run normal HDFS commands and operations?

Can you also share the spark-defaults.conf file under the /etc/spark/conf or /etc/spark2/conf directory for review?

Thanks


4 REPLIES

Super Guru
@yukti,

Are you able to share the full Spark log so that we can see a bit more context? Hitting the standby NameNode is common, but the client should fail over to the active NameNode after that, so it should not cause a failure. I suspect there is something else, so the full log might help us understand a bit more.
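
If a URI in the client config points directly at a single NameNode host instead of the nameservice, the client has nothing to fail over to. A quick way to spot that (just a sketch; the paths are the usual CDH client-config locations):

# look for hdfs:// URIs hard-coded to one NameNode host rather than the nameservice
grep -R "hdfs://" /etc/spark/conf /etc/hadoop/conf /etc/hive/conf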

Cheers
Eric

Contributor

Hi Eric,
Please see the attached error logs.

@EricL 

Super Guru
Hmm, it looks like Spark is not able to reach the NameNode. Are both of your NameNodes up and running? Can you run normal HDFS commands and operations?
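
As a rough sketch of those checks (the NameNode IDs "nn1"/"nn2" and the nameservice name "nameservice1" are assumptions; substitute the values from dfs.ha.namenodes.<nameservice>):

# expect one "active" and one "standby"
hdfs haadmin -getServiceState nn1
hdfs haadmin -getServiceState nn2
# a plain read through the nameservice should succeed regardless of which NameNode is active
hdfs dfs -ls hdfs://nameservice1/
hdfs dfs -ls /tmp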

Can you also share the spark-defaults.conf file under the /etc/spark/conf or /etc/spark2/conf directory for review?

Thanks

Contributor

@EricL 

Hi Eric,
It was pointing to the wrong Spark conf, so I replaced it with the correct one.

But now it is giving me another error.
I will open a new thread for that one.
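
For anyone hitting the same issue, a rough sketch of pointing a job at the intended client config (the paths are the usual CDH defaults and my_job.py is just a placeholder script name):

# confirm which client config directories the gateway host is actually using
ls -ld /etc/spark/conf /etc/hadoop/conf
# make sure Spark picks up the HA-aware Hadoop client config
export HADOOP_CONF_DIR=/etc/hadoop/conf
export SPARK_CONF_DIR=/etc/spark/conf
spark-submit --master yarn --deploy-mode client my_job.py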