We are working on a Kerberized CDH 5.8.3 cluster (a test cluster) and are facing the following Spark-HBase error:
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
Our Scala program works fine when we submit it in client mode, but it fails when we submit it in cluster mode.
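For context, in cluster mode on a Kerberized cluster the driver runs on a YARN node that has no access to the local ticket cache, so jobs are usually submitted with an explicit principal and keytab so that Spark can log in and obtain delegation tokens (including HBase tokens, provided `hbase-site.xml` and the HBase client jars are visible at submit time). A sketch of such a submit command follows; the principal, keytab path, class name, and jar name are placeholders, not values from this thread:

```shell
# Hypothetical cluster-mode submit on a Kerberized cluster.
# --principal/--keytab let YARN re-login and renew credentials;
# --files ships the HBase client configuration to the driver and executors.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --principal myuser@EXAMPLE.COM \
  --keytab /path/to/myuser.keytab \
  --files /etc/hbase/conf/hbase-site.xml \
  --class com.example.MyHBaseJob \
  myhbasejob.jar
```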
The program mainly uses the hbaseForeachPartition method to execute Put commands.
We have read many blog posts, suggestions, and explanations of this problem and have tried several different options. Unfortunately, none of them worked.
I would like confirmation that it is not possible to run a Spark-HBase program in cluster mode when the cluster is Kerberized.
Otherwise, I would appreciate a pointer to the right solution.
I am facing the same issue with Java code: I am not able to access HBase from Spark. See the post below for more details.
I am using the Cloudera-generated keytab and principal. Could you tell me which keytab and principal you are using?