Support Questions


Failed to specify server's Kerberos principal name when connecting to HBase in a Kerberos cluster

Contributor

Hello,

While connecting to HBase in a Kerberos cluster, I'm getting the error below:

Caused by: java.io.IOException: Couldn't setup connection for hbase/hbase1-devup.mstorm.com@MSTORM.COM to null
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$1.run(RpcClientImpl.java:696)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.handleSaslConnectionFailure(RpcClientImpl.java:668)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:777)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:920)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:889)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1222)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:32651)
	at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:372)
	at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:199)
	at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:62)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:346)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:320)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
	... 4 more
Caused by: java.io.IOException: Failed to specify server's Kerberos principal name
	at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.<init>(HBaseSaslRpcClient.java:117)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:639)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:166)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:769)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:766)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:766)
	... 17 more

My configuration files are:

/etc/krb5.conf

[libdefaults]
  renew_lifetime = 7d
  forwardable = true
  default_realm = EXAMPLE.COM
  ticket_lifetime = 24h
  dns_lookup_realm = false
  dns_lookup_kdc = false
  default_ccache_name = /tmp/krb5cc_%{uid}
  #default_tgs_enctypes = aes des3-cbc-sha1 rc4 des-cbc-md5
  #default_tkt_enctypes = aes des3-cbc-sha1 rc4 des-cbc-md5
[logging]
  default = FILE:/var/log/krb5kdc.log
  admin_server = FILE:/var/log/kadmind.log
  kdc = FILE:/var/log/krb5kdc.log
[realms]
  EXAMPLE.COM = {
    admin_server = ambari-server.example.com
    kdc = ambari-server.example.com
  }

Please help me with this.

11 REPLIES

Master Mentor

@Rohit Khose

Can you share how you installed your Kerberos packages?

On the KDC server, you MUST have run

# yum install krb5-server  krb5-libs

Then create the Kerberos database:

# kdb5_util create -s

Then enable and start the KDC and kadmin processes on the KDC (assuming you are on CentOS/RHEL 7):

$ systemctl enable krb5kdc 
$ systemctl start krb5kdc 
$ systemctl enable kadmin 
$ systemctl start kadmin 


Create a Kerberos Admin

On the KDC server, create a KDC admin by creating an admin principal:

# kadmin.local -q "addprinc admin/admin"

And on all the clients you MUST have run

# yum install  krb5-libs krb5-workstation


Your Kerberos configuration is wrong, starting with /etc/krb5.conf. It should look like the following and be copied to all clients (assuming you have run the Kerberos client installation):

[libdefaults]
  renew_lifetime = 7d
  forwardable = true
  default_realm = $YOUR_REALM
  ticket_lifetime = 24h
  dns_lookup_realm = false
  dns_lookup_kdc = false
  udp_preference_limit = 1
[domain_realm]
  your_realm = $YOUR_REALM
  .your_realm = $YOUR_REALM
[logging]
  default = FILE:/var/log/krb5kdc.log
  admin_server = FILE:/var/log/kadmind.log
  kdc = FILE:/var/log/krb5kdc.log
[realms]
  $YOUR_REALM = {
    admin_server = your_kdc.server_FQDN
    kdc = your_kdc.server_FQDN
     }

Contents of /var/kerberos/krb5kdc/kadm5.acl:

*/admin@$YOUR_REALM *

After these steps, run the Ambari Kerberos wizard, which will generate the correct keytabs in the /etc/security/keytabs/ directory. If you want full documentation, let me know.

Hope that helps

Contributor

@Geoffrey Shelton Okot Thanks for the reply.

I've redone the same configuration, but I'm still getting the same error. I enabled debug logging and found the error below:

>>> Pre-Authentication Data: PA-DATA type = 136
>>> Pre-Authentication Data: PA-DATA type = 19
    PA-ETYPE-INFO2 etype = 18, salt = MSTORM.COMhbasehbase1-devup.mstorm.com, s2kparams = null
    PA-ETYPE-INFO2 etype = 23, salt = MSTORM.COMhbasehbase1-devup.mstorm.com, s2kparams = null
    PA-ETYPE-INFO2 etype = 16, salt = MSTORM.COMhbasehbase1-devup.mstorm.com, s2kparams = null
>>> Pre-Authentication Data: PA-DATA type = 2 PA-ENC-TIMESTAMP
>>> Pre-Authentication Data: PA-DATA type = 133
>>> KdcAccessibility: remove ambari-devup.mstorm.com
>>> KDCRep: init() encoding tag is 126 req type is 11
>>> KRBError:
    cTime is Sat Aug 28 17:12:22 UTC 2032 1977325942000
    sTime is Wed Mar 07 10:15:19 UTC 2018 1520417719000
    suSec is 507841
    error code is 25
    error Message is Additional pre-authentication required
    cname is hbase/hbase1-devup.mstorm.com@MSTORM.COM
    sname is krbtgt/MSTORM.COM@MSTORM.COM
    eData provided.
    msgType is 30
>>> Pre-Authentication Data: PA-DATA type = 136
>>> Pre-Authentication Data: PA-DATA type = 19
    PA-ETYPE-INFO2 etype = 18, salt = MSTORM.COMhbasehbase1-devup.mstorm.com, s2kparams = null
    PA-ETYPE-INFO2 etype = 23, salt = MSTORM.COMhbasehbase1-devup.mstorm.com, s2kparams = null
    PA-ETYPE-INFO2 etype = 16, salt = MSTORM.COMhbasehbase1-devup.mstorm.com, s2kparams = null
>>> Pre-Authentication Data: PA-DATA type = 2 PA-ENC-TIMESTAMP
>>> Pre-Authentication Data: PA-DATA type = 133
KRBError received: NEEDED_PREAUTH
KrbAsReqBuilder: PREAUTH FAILED/REQ, re-send AS-REQ
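(For reference: error code 25, "Additional pre-authentication required" / NEEDED_PREAUTH, is the normal first step of the AS exchange rather than a failure by itself; the client is expected to re-send the AS-REQ with pre-authentication, as the last line shows. The salt values in this trace follow the default Kerberos rule: the realm string concatenated with the principal's name components. A small sketch of that rule, using the realm and principal from the log above:

```shell
# Sketch: the *default* Kerberos salt for a principal is the realm string
# followed by the principal's name components, concatenated with no separator.
default_salt() {
  realm=$1; shift
  printf '%s' "$realm" "$@"
  echo
}
default_salt "MSTORM.COM" "hbase" "hbase1-devup.mstorm.com"
# → MSTORM.COMhbasehbase1-devup.mstorm.com
```

If a keytab was generated with a different salt or encryption type than the KDC expects, pre-authentication keeps failing, so this is worth checking.)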

Along with the above error, I'm using a sample test to check whether a table is present in HBase. The following is the example I am using:

package com.hbase;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.security.UserGroupInformation;
import org.apache.log4j.Level;
import org.apache.log4j.Logger;
import org.junit.Test;

public class HBaseClientTest {

    @Test
    public void testingggAuth() throws Exception {
        try {
            Logger.getRootLogger().setLevel(Level.DEBUG);
            Configuration configuration = HBaseConfiguration.create();

            // ZooKeeper quorum and HBase endpoints
            configuration.set("hbase.zookeeper.quorum", "node1,node2,node3");
            configuration.set("hbase.master", "hbase_node:60000");
            configuration.set("hbase.zookeeper.property.clientPort", "2181");
            configuration.set("hadoop.security.authentication", "kerberos");
            configuration.set("hbase.security.authentication", "kerberos");
            configuration.set("zookeeper.znode.parent", "/hbase");
            //configuration.set("hbase.cluster.distributed", "true"); // check this setting on HBase side
            //configuration.set("hbase.rpc.protection", "authentication");

            // What principal the master/region servers use
            //configuration.set("hbase.regionserver.kerberos.principal", "hbase/_HOST@FIELD.HORTONWORKS.COM");
            //configuration.set("hbase.regionserver.keytab.file", "src/hbase.service.keytab");
            // This is needed even if you connect over rpc/zookeeper
            //configuration.set("hbase.master.kerberos.principal", "_host@REALM");
            //configuration.set("hbase.master.keytab.file", "/home/developers/Music/hbase.service.keytab");

            System.setProperty("java.security.auth.login.config", "/path/to/hbase_master_jaas.conf");
            System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");
            // Enable/disable krb5 debugging
            System.setProperty("sun.security.krb5.debug", "true");

            String principal = System.getProperty("kerberosPrincipal", "hbase/hbase1-devup.mstorm.com@MSTORM.COM");
            String keytabLocation = System.getProperty("kerberosKeytab", "/path/to/hbase.service.keytab");

            // kinit with principal and keytab
            UserGroupInformation.setConfiguration(configuration);
            UserGroupInformation.loginUserFromKeytab(principal, keytabLocation);
            //UserGroupInformation userGroupInformation = UserGroupInformation.loginUserFromKeytabAndReturnUGI("hbase-ambari_devup@MSTORM.COM", "/path/to/hbase.headless.keytab");
            //UserGroupInformation.setLoginUser(userGroupInformation);

            Connection connection = ConnectionFactory.createConnection(HBaseConfiguration.create(configuration));
            System.out.println("Table available: " + connection.getAdmin().isTableAvailable(TableName.valueOf("table_name")));
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
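The JAAS file referenced by java.security.auth.login.config looks roughly like this (the keytab path is a placeholder; the "Client" section name is what the ZooKeeper client looks for):

```
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  keyTab="/path/to/hbase.service.keytab"
  principal="hbase/hbase1-devup.mstorm.com@MSTORM.COM"
  useTicketCache=false
  storeKey=true
  debug=true;
};
```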

Please help me with this, as I am stuck authenticating a remote connection to HBase in a Kerberos-enabled cluster. Thank you in advance.

Master Mentor

@Rohit Khose

To be able to help, can you describe your setup (OS/HDP/Ambari versions)?

Can you attach your /etc/krb5.conf and /var/kerberos/krb5kdc/kadm5.acl?
Did you install the JCE policy files?

Where is the below FIELD.HORTONWORKS.COM coming from?

configuration.set("hbase.regionserver.kerberos.principal", "hbase/_HOST@FIELD.HORTONWORKS.COM");
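(As background: HBase clients expand the _HOST placeholder in these principal settings to the server's fully qualified hostname at connection time, which is why the realm in the template must match the cluster's actual realm. A rough sketch of that substitution; the hostname below is illustrative:

```shell
# Sketch of how the _HOST placeholder in a Kerberos principal template is
# expanded to the server's FQDN at connection time (hostname is illustrative).
expand_principal() {
  template=$1; fqdn=$2
  printf '%s\n' "$template" | sed "s/_HOST/$fqdn/"
}
expand_principal "hbase/_HOST@FIELD.HORTONWORKS.COM" "hbase1-devup.mstorm.com"
# → hbase/hbase1-devup.mstorm.com@FIELD.HORTONWORKS.COM
```

So with this setting, the client would ask the KDC for a service ticket in the FIELD.HORTONWORKS.COM realm, which does not exist on this cluster.)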

Can you also attach the following logs:

/var/log/kadmind.log

/var/log/krb5kdc.log

Did the Ambari Kerberos wizard run successfully?

Contributor

@Geoffrey Shelton Okot Now I get the following error:

	at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Couldn't setup connection for hbase/hbase1-devup.mstorm.com@MSTORM.COM to hbase/hbase1-devup.mstorm.com@MSTORM.COM
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$1.run(RpcClientImpl.java:696)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.handleSaslConnectionFailure(RpcClientImpl.java:668)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:777)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:920)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:889)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1222)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:32651)
	at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:372)
	at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:199)
	at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:62)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:346)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:320)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
	at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:64)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hadoop.ipc.RemoteException: GSS initiate failed
	at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.readStatus(HBaseSaslRpcClient.java:153)
	at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:189)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:642)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:166)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:769)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:766)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:766)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:920)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:889)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1222)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:32651)
	at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:372)
	at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:199)
	at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:62)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:346)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:320)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
	at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:64)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

Results :

Tests in error:
	HBaseClientTest.testingggAuth:51 » RetriesExhausted Failed after attempts=36, ...

Please let me know what the issue might be. Thanks in advance.

Contributor

Where is FIELD.HORTONWORKS.COM coming from? I am using a Java client example to connect to the Kerberized HBase cluster, and that realm is mentioned in the example. Also, I've already installed JCE on each node.

Master Mentor

@Rohit Khose

Precisely, the error is occurring because HBase is picking up this setting:

configuration.set("hbase.regionserver.kerberos.principal", "hbase/_HOST@FIELD.HORTONWORKS.COM");

Can you run the following on the KDC server:

# kadmin.local
kadmin.local: listprincs

and check the output for the hbase principals. Also check the /etc/hosts entries on your cluster; it could be a DNS issue.

Contributor

Output of

# kadmin.local
kadmin.local: listprincs

HTTP/ambari-devup.mstorm.com@MSTORM.COM
HTTP/dn1-devup.mstorm.com@MSTORM.COM
HTTP/dn2-devup.mstorm.com@MSTORM.COM
HTTP/dn3-devup.mstorm.com@MSTORM.COM
HTTP/dn4-devup.mstorm.com@MSTORM.COM
HTTP/hbase1-devup.mstorm.com@MSTORM.COM
HTTP/hbase2-devup.mstorm.com@MSTORM.COM
HTTP/snn-devup.mstorm.com@MSTORM.COM
HTTP/zk1-devup.mstorm.com@MSTORM.COM
HTTP/zk2-devup.mstorm.com@MSTORM.COM
HTTP/zk3-devup.mstorm.com@MSTORM.COM
K/M@MSTORM.COM
admin/admin@MSTORM.COM
ambari-qa-ambari_devup@MSTORM.COM
ambari-server-ambari_devup@MSTORM.COM
ambari-server@MSTORM.COM
dn/dn1-devup.mstorm.com@MSTORM.COM
dn/dn2-devup.mstorm.com@MSTORM.COM
dn/dn3-devup.mstorm.com@MSTORM.COM
dn/dn4-devup.mstorm.com@MSTORM.COM
hbase-ambari_devup@MSTORM.COM
hbase/dn1-devup.mstorm.com@MSTORM.COM
hbase/dn2-devup.mstorm.com@MSTORM.COM
hbase/dn3-devup.mstorm.com@MSTORM.COM
hbase/dn4-devup.mstorm.com@MSTORM.COM
hbase/hbase1-devup.mstorm.com@MSTORM.COM
hbase/hbase2-devup.mstorm.com@MSTORM.COM
hdfs-ambari_devup@MSTORM.COM
hdfs/ambari-devup.mstorm.com@MSTORM.COM
infra-solr/hbase2-devup.mstorm.com@MSTORM.COM
jhs/hbase1-devup.mstorm.com@MSTORM.COM
kadmin/admin@MSTORM.COM
kadmin/ambari-devup.mstorm.com@MSTORM.COM
kadmin/changepw@MSTORM.COM
kafka/zk1-devup.mstorm.com@MSTORM.COM
kafka/zk2-devup.mstorm.com@MSTORM.COM
kafka/zk3-devup.mstorm.com@MSTORM.COM
kiprop/ambari-devup.mstorm.com@MSTORM.COM
krbtgt/MSTORM.COM@MSTORM.COM
livy/ambari-devup.mstorm.com@MSTORM.COM
nfs/dn4-devup.mstorm.com@MSTORM.COM
nm/dn1-devup.mstorm.com@MSTORM.COM
nm/dn2-devup.mstorm.com@MSTORM.COM
nm/dn3-devup.mstorm.com@MSTORM.COM
nm/dn4-devup.mstorm.com@MSTORM.COM
nn/ambari-devup.mstorm.com@MSTORM.COM
nn/hbase2-devup.mstorm.com@MSTORM.COM
rm/ambari-devup.mstorm.com@MSTORM.COM
spark-ambari_devup@MSTORM.COM
yarn/snn-devup.mstorm.com@MSTORM.COM
zeppelin-ambari_devup@MSTORM.COM
zookeeper/zk1-devup.mstorm.com@MSTORM.COM
zookeeper/zk2-devup.mstorm.com@MSTORM.COM
zookeeper/zk3-devup.mstorm.com@MSTORM.COM

Contributor

I've removed the "hbase.regionserver.kerberos.principal" line from the code, but I am still getting the same error.

Master Mentor

@Rohit Khose

Question: are you connecting from a client outside the cluster? What is the hostname of the client?
Can you explain how you are executing from the client? Are you using a JAAS configuration file?
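One more thing worth checking: "Failed to specify server's Kerberos principal name" typically means the client configuration does not tell HBase which server principals to expect. A sketch of the relevant client-side properties for your cluster's hbase-site.xml (or the equivalent configuration.set() calls); the MSTORM.COM realm is taken from your listprincs output, so verify these against the hbase-site.xml on the cluster:

```
<property>
  <name>hbase.master.kerberos.principal</name>
  <value>hbase/_HOST@MSTORM.COM</value>
</property>
<property>
  <name>hbase.regionserver.kerberos.principal</name>
  <value>hbase/_HOST@MSTORM.COM</value>
</property>
```

The _HOST placeholder is expanded to each server's FQDN, so the same two entries cover all masters and region servers.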