SparkContext giving error in initialization: org.apache.hadoop.ipc.StandbyException
Labels:
- Apache Spark
- Apache YARN
- Apache Zookeeper
Created ‎09-06-2019 03:11 AM
While running a normal Spark job, the SparkContext fails to initialize with the error below:
[Thread-6] ERROR org.apache.spark.SparkContext - Error initializing SparkContext.
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby.
I believe it is pointing to the standby NameNode.
This error started after we enabled NameNode HA in our Cloudera cluster.
I have checked hive-site.xml, core-site.xml and spark-defaults.xml.
They all have the correct configs pointing to the nameservice ID.
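For reference, after enabling NameNode HA the client-side configuration is expected to look roughly like the sketch below. This is only an illustration; the nameservice ID "nameservice1" and the NameNode hostnames/ports are placeholders, not values from this cluster.

core-site.xml (the default FS should reference the nameservice ID, not a single NameNode host):
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://nameservice1</value>
</property>

hdfs-site.xml:
<property>
  <name>dfs.nameservices</name>
  <value>nameservice1</value>
</property>
<property>
  <name>dfs.ha.namenodes.nameservice1</name>
  <value>namenode1,namenode2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.nameservice1.namenode1</name>
  <value>nn1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.nameservice1.namenode2</name>
  <value>nn2.example.com:8020</value>
</property>
<property>
  <name>dfs.client.failover.proxy.provider.nameservice1</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>

Without dfs.client.failover.proxy.provider.<nameservice>, the client cannot fail over from the standby NameNode to the active one, which is one common cause of the StandbyException above.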
Created ‎09-06-2019 03:38 PM
Are you able to share the full Spark log so that we can see a bit more context? Hitting the standby NameNode (SBNN) is common, but the client should fail over to the active NN after that, and it should not cause a failure. I suspect there is something else, so the full log might help us understand a bit more.
Cheers
Eric
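If the job runs on YARN, one way to collect the complete application log for review is the yarn logs CLI; the application ID below is just a placeholder for the one printed when the job was submitted:

yarn logs -applicationId application_1567000000000_0001 > spark_app_full.log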
Created on ‎09-09-2019 12:45 AM - edited ‎09-09-2019 02:03 AM
Created ‎09-10-2019 11:54 PM
Can you also share the spark-defaults.conf file, which is under the /etc/spark/conf or /etc/spark2/conf directory, for review?
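In particular, any HDFS locations in that file (for example spark.eventLog.dir, or spark.yarn.historyServer.address if it is set) should reference the HA nameservice ID rather than a single NameNode host. As a rough illustration only, with placeholder values:

spark.eventLog.dir=hdfs://nameservice1/user/spark/applicationHistory
spark.yarn.historyServer.address=http://shs-host.example.com:18088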
Thanks
Created ‎09-11-2019 02:08 AM
Hi Eric,
It was pointing to the wrong Spark conf, so I replaced it with the correct one.
But now it's giving me another error.
I will open a new thread for that error.
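For anyone who lands on the same issue: if the client host is picking up a stale Spark conf directory, one way to point at the cluster-managed one is to set SPARK_CONF_DIR before submitting. The path and application details below are placeholders, assuming the standard CDH layout:

export SPARK_CONF_DIR=/etc/spark/conf
spark-submit --master yarn --class com.example.MyApp my-app.jar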
