
Failed to specify server's Kerberos principal name when connecting to HBase in a Kerberos cluster

Contributor

Hello,

While connecting to HBase in a Kerberos cluster, I'm getting the error below:

Caused by: java.io.IOException: Couldn't setup connection for hbase/hbase1-devup.mstorm.com@MSTORM.COM to null
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$1.run(RpcClientImpl.java:696)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.handleSaslConnectionFailure(RpcClientImpl.java:668)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:777)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:920)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:889)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1222)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:32651)
    at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:372)
    at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:199)
    at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:62)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:346)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:320)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
    ... 4 more
Caused by: java.io.IOException: Failed to specify server's Kerberos principal name
    at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.<init>(HBaseSaslRpcClient.java:117)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:639)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:166)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:769)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:766)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:766)
    ... 17 more

My config file is:

/etc/krb5.conf

[libdefaults]
  renew_lifetime = 7d
  forwardable = true
  default_realm = EXAMPLE.COM
  ticket_lifetime = 24h
  dns_lookup_realm = false
  dns_lookup_kdc = false
  default_ccache_name = /tmp/krb5cc_%{uid}
  #default_tgs_enctypes = aes des3-cbc-sha1 rc4 des-cbc-md5
  #default_tkt_enctypes = aes des3-cbc-sha1 rc4 des-cbc-md5

[logging]
  default = FILE:/var/log/krb5kdc.log
  admin_server = FILE:/var/log/kadmind.log
  kdc = FILE:/var/log/krb5kdc.log

[realms]
  EXAMPLE.COM = {
    admin_server = ambari-server.example.com
    kdc = ambari-server.example.com
  }

Please help me with this.

11 REPLIES

Contributor

I had the same issue with Spark2 on HDP 3.1, using Isilon/OneFS as storage instead of HDFS.

The OneFS service management pack doesn't provide configuration for some of the HDFS parameters that Spark2 expects (they aren't available at all in Ambari), such as dfs.datanode.kerberos.principal. Without these parameters, the Spark2 History Server may fail to start and report errors such as "Failed to specify server's Kerberos principal name".

I added the following properties to OneFS under Custom hdfs-site:

dfs.datanode.kerberos.principal=hdfs/_HOST@<MY REALM>
dfs.datanode.keytab.file=/etc/security/keytabs/hdfs.service.keytab
dfs.namenode.kerberos.principal=hdfs/_HOST@<MY REALM>
dfs.namenode.keytab.file=/etc/security/keytabs/hdfs.service.keytab
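
As a quick sanity check that a principal/keytab pair like the above is usable at all, here is a minimal sketch of a keytab login with Hadoop's UserGroupInformation. The hostname, realm, and keytab path are assumptions; note that _HOST in the Ambari entries above is expanded to the actual hostname by Hadoop, so a plain client login needs the fully resolved principal:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KeytabLoginCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Kerberos must be enabled before attempting the keytab login.
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);
        // Hypothetical principal and keytab; substitute your host, realm, and path.
        UserGroupInformation.loginUserFromKeytab(
                "hdfs/node1.example.com@EXAMPLE.COM",
                "/etc/security/keytabs/hdfs.service.keytab");
        System.out.println("Logged in as: "
                + UserGroupInformation.getLoginUser().getUserName());
    }
}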

Adding these properties resolved the initial error. Thereafter I was getting an error of the following form:

Server has invalid Kerberos principal: hdfs/<isilon>.my.realm.com@my.realm.com, expecting: hdfs/somewhere.else.entirely@my.realm.com

This was related to cross-realm authentication. It was resolved by adding the setting below to Custom hdfs-site:

dfs.namenode.kerberos.principal.pattern=*
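
If editing hdfs-site.xml isn't possible for a particular client, the same relaxation can also be set programmatically on the client's Configuration. A minimal sketch, where the NameNode URI is a placeholder and a valid Kerberos ticket (e.g., from kinit) is assumed to already be present:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CrossRealmHdfsCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        // Accept server principals from any realm instead of only the
        // client's default realm (the cause of the "Server has invalid
        // Kerberos principal ... expecting ..." mismatch above).
        conf.set("dfs.namenode.kerberos.principal.pattern", "*");
        // Placeholder NameNode URI; substitute your Isilon/NameNode endpoint.
        try (FileSystem fs = FileSystem.get(
                new URI("hdfs://isilon.my.realm.com:8020"), conf)) {
            System.out.println("Root exists: " + fs.exists(new Path("/")));
        }
    }
}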

(Reposting my answer from https://stackoverflow.com/questions/35325720/connecting-to-kerberrized-hdfs-java-lang-illegalargumen... )

Contributor

In the OP's case, it might be that the hdfs-site file needs to be available when connecting to HBase.
If I recall correctly, some HBase clients (such as the NiFi processors) need the Hadoop configuration files core-site.xml and hdfs-site.xml to be specified. If these can't be found, or don't contain the properties above, the same error can occur.
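
For a plain Java client, a minimal sketch of what that setup looks like. The file paths, principals, and keytab are assumptions; hbase.master.kerberos.principal and hbase.regionserver.kerberos.principal are the client-side properties that name the server principals, and leaving them unset (and absent from the loaded config files) is a common cause of "Failed to specify server's Kerberos principal name":

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.security.UserGroupInformation;

public class SecureHBaseClient {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // Assumed client-config locations; adjust to your cluster layout.
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
        conf.addResource(new Path("/etc/hbase/conf/hbase-site.xml"));
        // Tell the client which server principals to expect; _HOST is
        // substituted with each server's hostname at connection time.
        conf.set("hbase.master.kerberos.principal", "hbase/_HOST@EXAMPLE.COM");
        conf.set("hbase.regionserver.kerberos.principal", "hbase/_HOST@EXAMPLE.COM");
        UserGroupInformation.setConfiguration(conf);
        // Hypothetical login principal and keytab path.
        UserGroupInformation.loginUserFromKeytab(
                "hbase/client1.example.com@EXAMPLE.COM",
                "/etc/security/keytabs/hbase.service.keytab");
        try (Connection connection = ConnectionFactory.createConnection(conf)) {
            System.out.println("Connected: " + !connection.isClosed());
        }
    }
}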