Support Questions
Find answers, ask questions, and share your expertise

SparkContext giving error in initialization : org.apache.hadoop.ipc.StandbyException

Solved

Explorer

While running a normal Spark job, the SparkContext fails to initialize with the error below:

 [Thread-6] ERROR org.apache.spark.SparkContext - Error initializing SparkContext.
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby.

 

I believe it is pointing to the standby NameNode.

This error started appearing after we enabled NameNode HA in our Cloudera cluster.

I have checked hive-site.xml, core-site.xml, and spark-defaults.conf; they all have the correct configs pointing to the nameservice ID.
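For reference, an HA-enabled client config typically resolves the nameservice roughly as below. This is only a sketch: `mycluster`, `nn1`, `nn2`, and the hostnames are placeholders, not values from this cluster.

```xml
<!-- core-site.xml: default FS points at the logical nameservice, not a host -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://mycluster</value>
</property>

<!-- hdfs-site.xml: the nameservice, its NameNodes, and the failover proxy -->
<property>
  <name>dfs.nameservices</name>
  <value>mycluster</value>
</property>
<property>
  <name>dfs.ha.namenodes.mycluster</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn1</name>
  <value>namenode1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn2</name>
  <value>namenode2.example.com:8020</value>
</property>
<property>
  <name>dfs.client.failover.proxy.provider.mycluster</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
```

A client whose fs.defaultFS points directly at a single NameNode host, rather than at the nameservice, will fail with a StandbyException whenever that host happens to be in standby.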

1 ACCEPTED SOLUTION


Re: SparkContext giving error in initialization : org.apache.hadoop.ipc.StandbyException

Guru
Hmm, it looks like Spark is not able to reach the NameNode. Are both of your NameNodes up and running? Can you run normal HDFS commands and operations?

Can you also share the spark-defaults.conf file, found under the /etc/spark/conf or /etc/spark2/conf directory, for review?

Thanks
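The checks suggested above can be run from a shell. Here `nn1`/`nn2` are placeholder service IDs, so substitute the values of `dfs.ha.namenodes.<nameservice>` from your hdfs-site.xml; the `hdfs` calls are guarded so the sketch is safe to run on a host without the CLI installed:

```shell
# Report the HA state (active/standby) of each NameNode, then try a basic
# HDFS operation through the nameservice. nn1/nn2 are placeholder service IDs.
if command -v hdfs >/dev/null 2>&1; then
  result="$(hdfs haadmin -getServiceState nn1; hdfs haadmin -getServiceState nn2)"
  hdfs dfs -ls /   # sanity-check basic HDFS access through the nameservice
else
  result="hdfs CLI not found on this host"
fi
echo "$result"
```

Exactly one of the two NameNodes should report "active"; if both report "standby", no client can perform read or write operations and every job will fail the same way.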
4 REPLIES

Re: SparkContext giving error in initialization : org.apache.hadoop.ipc.StandbyException

Guru
@yukti,

Are you able to share the full Spark log so that we can see a bit more context? Hitting the standby NameNode (SBNN) is common, but the client should fail over to the active NameNode afterwards, so that alone should not cause a failure. I suspect there is something else, so the full log might help us understand a bit more.

Cheers
Eric

Re: SparkContext giving error in initialization : org.apache.hadoop.ipc.StandbyException

Explorer

Hi Eric,
please see the attached error logs.

@EricL 


Re: SparkContext giving error in initialization : org.apache.hadoop.ipc.StandbyException

Explorer

@EricL 

Hi Eric,
it was pointing to the wrong Spark conf, so I replaced it with the correct one.

But now it's giving me another error.
I will open a new thread for that error.
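For anyone landing here with the same symptom: the root cause above was Spark reading a stale conf directory. A minimal sketch of pointing a shell session at the right client configs follows; the paths are the usual CDH defaults and may differ on your cluster:

```shell
# Point Spark/Hadoop clients at the conf dirs that know about the HA nameservice.
# /etc/hadoop/conf and /etc/spark/conf are typical CDH defaults -- verify locally.
export HADOOP_CONF_DIR=/etc/hadoop/conf
export SPARK_CONF_DIR=/etc/spark/conf   # use /etc/spark2/conf for Spark 2
echo "Using HADOOP_CONF_DIR=$HADOOP_CONF_DIR and SPARK_CONF_DIR=$SPARK_CONF_DIR"
```

After updating the conf dirs, confirm that the hdfs-site.xml inside them actually defines your nameservice before rerunning the job.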