I am writing a MapReduce application to access HBase data on a CDH cluster with Kerberos enabled. I am facing an issue and hope someone can help.
My MapReduce code:
// open the table and prepare a write; on a secure cluster this needs valid Kerberos credentials
HTable table = new HTable(conf, "myTable");
Put p = new Put(Bytes.toBytes("myLittleRow"));
The code above runs successfully in local mode with java -jar or hadoop jar. But when I try to run it from a Hue-Oozie workflow, this error occurs:
FATAL [main] org.apache.hadoop.hbase.ipc.RpcClientImpl: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
So I tried using TableMapReduceUtil.initCredentials() to obtain a valid HBase token for the job.
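A sketch of what I tried in the driver (the class and job names here are just placeholders, not my real ones):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.mapreduce.Job;

public class SubmitWithHBaseToken {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = Job.getInstance(conf, "hbase-mr");
        // asks HBase for a delegation token using the submitter's Kerberos
        // credentials and stores it in the job, so tasks can authenticate
        // without their own Kerberos ticket
        TableMapReduceUtil.initCredentials(job);
        // ... set mapper, input and output here, then submit
        job.waitForCompletion(true);
    }
}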
But the issue still happens. Does anyone have any idea?
My CDH version is 5.9.0, running on 3 nodes, and the Kerberos version is 1.10.1.
Finally I found the solution. It's easy to configure a Hue-Oozie job on CDH to talk to a secure HBase. All I needed was to enable the HBase credential when setting up the workflow, and to add hbase-site.xml to the action's Job XML path.
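For reference, the workflow XML that Hue generates ends up looking roughly like this (the workflow, action, and file locations below are placeholders from my setup, not something Hue mandates):

<workflow-app name="hbase-mr-wf" xmlns="uri:oozie:workflow:0.5">
  <credentials>
    <!-- added by Hue when the HBase credential is enabled for the action -->
    <credential name="hbase" type="hbase"/>
  </credentials>
  <start to="hbase-mr"/>
  <action name="hbase-mr" cred="hbase">
    <map-reduce>
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <!-- hbase-site.xml uploaded to the workflow workspace and listed as a Job XML -->
      <job-xml>hbase-site.xml</job-xml>
      <configuration>
        <!-- the usual mapper/reducer/input/output properties of the job go here -->
      </configuration>
    </map-reduce>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>HBase MR action failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name="end"/>
</workflow-app>

As I understand it, with the credential attached to the action, the Oozie server obtains the HBase delegation token before launching the job, which is why no kinit is needed on the worker nodes.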