Created 09-06-2019 03:11 AM
While running a normal Spark job, the SparkContext fails to initialize with the error below:
[Thread-6] ERROR org.apache.spark.SparkContext - Error initializing SparkContext.
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby.
I believe it is pointing to the standby NameNode.
This error started after we enabled NameNode HA on our Cloudera cluster.
I have checked hive-site.xml, core-site.xml and spark-defaults.xml.
They all have the correct configs pointing to the nameservice ID.
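For reference, after NameNode HA is enabled the client configs that Spark reads should resolve the nameservice ID rather than a fixed NameNode host. A rough sanity check from a gateway node is sketched below; the NameNode IDs nn1/nn2 and the /etc/spark/conf/yarn-conf path are assumptions based on a typical CDH layout, so adjust them for your cluster:

# Show the filesystem URI the Hadoop client config resolves to;
# with HA it should print the nameservice ID, e.g. hdfs://nameservice1
hdfs getconf -confKey fs.defaultFS

# Check which NameNode is currently active vs standby
# (nn1/nn2 are the HA NameNode IDs defined in hdfs-site.xml)
hdfs haadmin -getServiceState nn1
hdfs haadmin -getServiceState nn2

# Verify the copy of the configs that Spark actually picks up also carries
# the nameservice, not an old NameNode hostname (path is a common CDH default)
grep -A1 "fs.defaultFS" /etc/spark/conf/yarn-conf/core-site.xml

If the last command still shows an old NameNode hostname, Spark is reading a stale client configuration.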
Created 09-06-2019 03:38 PM
Created on 09-09-2019 12:45 AM - edited 09-09-2019 02:03 AM
Hi Eric,
Please see the attached error logs.
Created 09-10-2019 11:54 PM
Created 09-11-2019 02:08 AM
Hi Eric,
It was pointing to the wrong Spark conf, so I replaced it with the new one.
But now it's giving me another error.
I will open a new thread for that error.
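For anyone hitting the same issue: the "wrong Spark conf" here was a stale conf directory still pointing at the old single NameNode host instead of the HA nameservice. A minimal sketch of forcing spark-submit to use the refreshed client configs follows; the paths assume a typical Cloudera Manager gateway layout (usually after a "Deploy Client Configuration" from CM), and my_job.py is only a placeholder:

# Point the Spark client at the refreshed Hadoop and Spark client configs
export HADOOP_CONF_DIR=/etc/hadoop/conf
export SPARK_CONF_DIR=/etc/spark/conf

# Re-run the job; the application should now resolve hdfs://<nameservice-id>
spark-submit --master yarn --deploy-mode client my_job.py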