Support Questions

Find answers, ask questions, and share your expertise

SASL Authentication Failed Attempting to Write From Spark to HBase SHC

New Contributor

When attempting to write a DataFrame from Spark to HBase on YARN using the Spark HBase Connector (SHC), the write stage never completes: the jobs do not fail and will run indefinitely (one ran an entire night before I killed it in the morning), but the executors throw:

WARN AbstractRpcClient: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
ERROR AbstractRpcClient: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]

I have a valid cached Kerberos ticket, a keytab, and a krb5.conf file.
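For reference, the ticket cache on the submitting host can be checked with klist, and re-acquired from the keytab if needed (the keytab path and principal below are placeholders, not taken from this thread):

```shell
# List the tickets in the local credential cache; a valid TGT for
# your user principal should appear here.
klist

# If the ticket has expired, re-acquire it from the keytab
# (path and principal are examples only).
kinit -kt /etc/security/keytabs/user.keytab user@EXAMPLE.COM
```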

Versions

Spark - 1.6.2

HDP - 2.5

HBase - 1.1.2.2.5.3.0-37

SHC - 1.0.0-1.6-s_2.10

Any help would be greatly appreciated 🙂



Please see the answer below from Bikas:

https://community.hortonworks.com/questions/46500/spark-cant-connect-to-hbase-using-kerberos-in-clus...

Apache Spark 2.0 supports automatically acquiring HBase security tokens for a job and all of its executors. Apache Spark 1.6 does not have that feature, but HDP Spark 1.6 has it backported, so it can also acquire HBase tokens for jobs. The tokens are acquired automatically if 1) security is enabled, 2) hbase-site.xml is present on the client classpath, and 3) that hbase-site.xml has Kerberos security configured. HBase tokens for the HBase master specified in that hbase-site.xml are then acquired and used in the job.
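As a sketch of what that looks like in practice (the paths, class name, principal, and package coordinates below are illustrative placeholders, not taken from this thread), the submit command needs hbase-site.xml available at submit time so the backported token logic can find the Kerberos-enabled HBase configuration, plus the principal/keytab options for long-running jobs:

```shell
# Hypothetical spark-submit invocation on a Kerberized HDP 2.5 cluster.
# All paths, the class name, and the principal are examples only.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.HBaseWriter \
  --files /etc/hbase/conf/hbase-site.xml \
  --principal user@EXAMPLE.COM \
  --keytab /etc/security/keytabs/user.keytab \
  my-app.jar
```

With hbase-site.xml on the client classpath at launch, HDP Spark 1.6 obtains an HBase delegation token up front, which the executors then use instead of each trying (and failing) to find a Kerberos TGT themselves.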