
HBase authentication in cluster mode


I created a Spark application that writes data into HBase 1.2, and I want to run it on my cluster (CDH 5.11). However, I cannot run spark2-submit in cluster mode: HBase tries to authenticate from the nodes of the cluster, but they cannot present valid Kerberos credentials. I am compiling my application against hadoop-common 2.6, which provides the UserGroupInformation class for managing authentication on the cluster.

val configuration = HBaseConfiguration.create()
try {
  var value = 0

  while (true) {
    // ZooKeeper, Kerberos and output-table configuration
    configuration.set("hbase.zookeeper.quorum", "xxx")
    configuration.set("hbase.zookeeper.property.clientPort", "2181")
    configuration.set("hbase.master", "xxx")
    configuration.set("hadoop.security.authentication", "kerberos")
    configuration.set("hbase.security.authentication", "kerberos")
    configuration.set("hbase.master.kerberos.principal", "xxx")
    configuration.set("hbase.regionserver.kerberos.principal", "xxx")
    configuration.set("hbase.rest.kerberos.principal", "hbase/_HOST@xxx")
    configuration.set("hbase.thrift.kerberos.principal", "hbase/_HOST@xxx")
    configuration.set("hbase.cluster.distributed", "true")
    configuration.set("dfs.namenode.kerberos.principal.pattern", "*")
    configuration.addResource("/etc/hbase/conf/core-site.xml")
    configuration.addResource("/etc/hbase/conf/hbase-site.xml")

    // Log in / authenticate from the keytab
    UserGroupInformation.setConfiguration(configuration)
    val ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI("xxx", "xxx")
    ugi.reloginFromKeytab() // re-login as needed by the application

    value += 1
    val rdd = sc.parallelize(Array(
      (Bytes.toBytes("ROW_KEY_TEST"),
        Array((Bytes.toBytes(columnFamily), Bytes.toBytes("COLUMN_1"), Bytes.toBytes(value.toString))))
    ))

    // Obtain the HBase delegation token, then broadcast configuration and credentials
    @transient val job = Job.getInstance(configuration)
    TableMapReduceUtil.initCredentials(job)
    val broadcastedConf = sc.broadcast(new SerializableWritable(configuration))
    val credentialsConf = sc.broadcast(new SerializableWritable(job.getCredentials))
    val hbaseContext = new HBaseContext(broadcastedConf, credentialsConf, configuration)

    hbaseContext.bulkPut[(Array[Byte], Array[(Array[Byte], Array[Byte], Array[Byte])])](rdd,
      TableName.valueOf(tableName),
      (putRecord) => {
        val put = new Put(putRecord._1)
        putRecord._2.foreach(putValue =>
          put.addColumn(putValue._1, putValue._2, putValue._3))
        put
      })
  }
} finally {
  sc.stop()
}

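I have also been wondering whether the HBase token acquisition needs to run as the logged-in UGI rather than the process's default login. This is only an untested sketch of what I mean (the principal and keytab path are placeholders, and it assumes the keytab is readable on the driver):

```scala
import java.security.PrivilegedExceptionAction
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
import org.apache.hadoop.mapreduce.Job
import org.apache.hadoop.security.UserGroupInformation

// Sketch: run the token acquisition inside doAs, so that
// TableMapReduceUtil.initCredentials executes with the Kerberos
// identity instead of the default (SIMPLE) login.
val ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
  "user@REALM", "/path/to/user.keytab") // placeholders
val job = ugi.doAs(new PrivilegedExceptionAction[Job] {
  override def run(): Job = {
    val j = Job.getInstance(configuration)
    TableMapReduceUtil.initCredentials(j) // obtains the HBase delegation token
    j
  }
})
```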
The error I am getting is the following:

17/10/04 14:49:25 WARN security.UserGroupInformation: PriviledgedActionException as:exafp000 (auth:SIMPLE) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
17/10/04 14:49:25 DEBUG security.UserGroupInformation: PrivilegedAction as:exafp000 (auth:SIMPLE) from:org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.handleSaslConnectionFailure(RpcClientImpl.java:644)
17/10/04 14:49:25 WARN ipc.RpcClientImpl: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
17/10/04 14:49:25 ERROR ipc.RpcClientImpl: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:181)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:618)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$700(RpcClientImpl.java:163)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:744)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:741)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:741)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:907)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:874)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1243)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:227)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.execService(ClientProtos.java:34118)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1633)
at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:104)
at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:94)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:136)
at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:107)
at org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callBlockingMethod(CoprocessorRpcChannel.java:73)
at org.apache.hadoop.hbase.protobuf.generated.AuthenticationProtos$AuthenticationService$BlockingStub.getAuthenticationToken(AuthenticationProtos.java:4512)
at org.apache.hadoop.hbase.security.token.TokenUtil.obtainToken(TokenUtil.java:86)
at org.apache.hadoop.hbase.security.token.TokenUtil$1.run(TokenUtil.java:111)
at org.apache.hadoop.hbase.security.token.TokenUtil$1.run(TokenUtil.java:108)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
at org.apache.hadoop.hbase.security.User$SecureHadoopUser.runAs(User.java:340)
at org.apache.hadoop.hbase.security.token.TokenUtil.obtainToken(TokenUtil.java:108)
at org.apache.hadoop.hbase.security.token.TokenUtil.addTokenForJob(TokenUtil.java:329)
at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.initCredentials(TableMapReduceUtil.java:490)
at it.exage.HBaseBulkPutExampleProduction$.main(HBaseBulkPutExampleProduction.scala:133)
at it.exage.HBaseBulkPutExampleProduction.main(HBaseBulkPutExampleProduction.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:646)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)

Has anyone run into the same problem?
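For reference, this is roughly how I submit the job in cluster mode (paths, class name and principal are placeholders). As I understand it, spark2-submit's --principal and --keytab options are meant to let YARN log in and renew tickets for the application itself, which may interact with the manual UGI login in the code above:

```shell
# Placeholder principal, keytab and jar; adjust for your cluster.
spark2-submit \
  --master yarn \
  --deploy-mode cluster \
  --principal user@REALM \
  --keytab /path/to/user.keytab \
  --files /etc/hbase/conf/hbase-site.xml \
  --class it.exage.HBaseBulkPutExampleProduction \
  my-application.jar
```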