Member since: 10-04-2016
Posts: 24
Kudos Received: 1
Solutions: 0
12-29-2018
12:16 PM
@Jay Kumar SenSharma I'm doing it by downloading HBase 2.1.1 and replacing the existing HBase 2.0.0 under /usr/hdp/3.0.1.0-187/. Is that a workable approach? The Masters now start, but the RegionServers do not; they fail with the following error:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hbase/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
1 [regionserver/ubuntu20:60020] ERROR org.apache.hadoop.hbase.regionserver.HRegionServer - ***** ABORTING region server ubuntu20.mcloud.com,60020,1546085700471: Unhandled: Found interface org.apache.hadoop.hdfs.protocol.HdfsFileStatus, but class was expected *****
java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.hdfs.protocol.HdfsFileStatus, but class was expected
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper.createOutput(FanOutOneBlockAsyncDFSOutputHelper.java:768)
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper.access$400(FanOutOneBlockAsyncDFSOutputHelper.java:118)
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper$16.doCall(FanOutOneBlockAsyncDFSOutputHelper.java:848)
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper$16.doCall(FanOutOneBlockAsyncDFSOutputHelper.java:843)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper.createOutput(FanOutOneBlockAsyncDFSOutputHelper.java:856)
at org.apache.hadoop.hbase.io.asyncfs.AsyncFSOutputHelper.createOutput(AsyncFSOutputHelper.java:51)
at org.apache.hadoop.hbase.regionserver.wal.AsyncProtobufLogWriter.initOutput(AsyncProtobufLogWriter.java:167)
at org.apache.hadoop.hbase.regionserver.wal.AbstractProtobufLogWriter.init(AbstractProtobufLogWriter.java:165)
at org.apache.hadoop.hbase.wal.AsyncFSWALProvider.createAsyncWriter(AsyncFSWALProvider.java:113)
at org.apache.hadoop.hbase.regionserver.wal.AsyncFSWAL.createWriterInstance(AsyncFSWAL.java:612)
at org.apache.hadoop.hbase.regionserver.wal.AsyncFSWAL.createWriterInstance(AsyncFSWAL.java:124)
at org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.rollWriter(AbstractFSWAL.java:756)
at org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.rollWriter(AbstractFSWAL.java:486)
at org.apache.hadoop.hbase.regionserver.wal.AsyncFSWAL.<init>(AsyncFSWAL.java:251)
at org.apache.hadoop.hbase.wal.AsyncFSWALProvider.createWAL(AsyncFSWALProvider.java:73)
at org.apache.hadoop.hbase.wal.AsyncFSWALProvider.createWAL(AsyncFSWALProvider.java:48)
at org.apache.hadoop.hbase.wal.AbstractFSWALProvider.getWAL(AbstractFSWALProvider.java:138)
at org.apache.hadoop.hbase.wal.AbstractFSWALProvider.getWAL(AbstractFSWALProvider.java:57)
at org.apache.hadoop.hbase.wal.WALFactory.getWAL(WALFactory.java:276)
at org.apache.hadoop.hbase.regionserver.HRegionServer.getWAL(HRegionServer.java:2100)
at org.apache.hadoop.hbase.regionserver.HRegionServer.buildServerLoad(HRegionServer.java:1311)
at org.apache.hadoop.hbase.regionserver.HRegionServer.tryRegionServerReport(HRegionServer.java:1193)
at org.apache.hadoop.hbase.regionserver.HRegionServer.run(HRegionServer.java:1013)
at java.lang.Thread.run(Thread.java:745)
4 [regionserver/ubuntu20:60020] ERROR org.apache.hadoop.hbase.regionserver.HRegionServer - RegionServer abort: loaded coprocessors are: []
101 [main] ERROR org.apache.hadoop.hbase.regionserver.HRegionServerCommandLine - Region server exiting
java.lang.RuntimeException: HRegionServer Aborted
at org.apache.hadoop.hbase.regionserver.HRegionServerCommandLine.start(HRegionServerCommandLine.java:67)
at org.apache.hadoop.hbase.regionserver.HRegionServerCommandLine.run(HRegionServerCommandLine.java:87)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:149)
at org.apache.hadoop.hbase.regionserver.HRegionServer.main(HRegionServer.java:3021)
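From what I can tell, this IncompatibleClassChangeError comes from HBase's asyncfs WAL writer having been compiled against a Hadoop in which HdfsFileStatus was still a class rather than an interface, so a manually swapped-in HBase 2.1.1 build can hit it against the HDP 3.0.1 Hadoop jars. A workaround that is commonly suggested (I have not verified it on this exact build) is to switch the WAL back to the plain FileSystem provider in hbase-site.xml, which bypasses FanOutOneBlockAsyncDFSOutputHelper entirely:

<property>
  <name>hbase.wal.provider</name>
  <value>filesystem</value> <!-- use the classic FSHLog writer instead of asyncfs -->
</property>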
12-29-2018
11:47 AM
@Jay Kumar SenSharma Thank you for your reply. I tried all of the above, but I'm still getting the same error. Do you have any other suggestions for resolving this?
12-29-2018
11:07 AM
I'm getting the error below while starting the HBase service:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hbase/lib/phoenix-5.0.0.3.0.1.0-187-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hbase/lib/phoenix-5.0.0.3.0.1.0-187-hive.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hbase/lib/phoenix-5.0.0.3.0.1.0-187-pig.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hbase/lib/phoenix-5.0.0.3.0.1.0-187-thin-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hbase/lib/phoenix-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hbase/lib/phoenix-hive.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hbase/lib/phoenix-thin-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hbase/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
Exception in thread "main" java.lang.NoSuchMethodError: com.ctc.wstx.io.StreamBootstrapper.getInstance(Ljava/lang/String;Lcom/ctc/wstx/io/SystemId;Ljava/io/InputStream;)Lcom/ctc/wstx/io/StreamBootstrapper;
at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2918)
at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2901)
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2953)
at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2926)
at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2806)
at org.apache.hadoop.conf.Configuration.get(Configuration.java:1200)
at org.apache.hadoop.conf.Configuration.getTrimmed(Configuration.java:1254)
at org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1660)
at org.apache.hadoop.hbase.HBaseConfiguration.checkDefaultsVersion(HBaseConfiguration.java:66)
at org.apache.hadoop.hbase.HBaseConfiguration.addHbaseResources(HBaseConfiguration.java:80)
at org.apache.hadoop.hbase.HBaseConfiguration.create(HBaseConfiguration.java:94)
at org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:149)
at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:3126)
Please help me with this. Thanks in advance.
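One thing worth checking: a NoSuchMethodError on com.ctc.wstx.io.StreamBootstrapper.getInstance usually means an older Woodstox jar on the classpath is shadowing the woodstox-core version that Hadoop 3's Configuration parser expects. A minimal diagnostic sketch (the class name WhichJar is mine, not from any HDP tooling) that, when run with the same classpath HBase uses, prints which jar actually supplies the class:

public class WhichJar {
    public static void main(String[] args) throws ClassNotFoundException {
        // Load the class exactly as Hadoop's Configuration parser would,
        // then report which jar on the classpath provided it.
        Class<?> c = Class.forName("com.ctc.wstx.io.StreamBootstrapper");
        System.out.println(c.getProtectionDomain().getCodeSource().getLocation());
    }
}

If that points at a stale copy (for example one bundled inside a fat client jar), removing or reordering it should resolve the error.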
04-13-2018
11:06 AM
@Venkata Sudheer Kumar M You can use the --files parameter when deploying applications on YARN, like this:
spark-submit --class com.virtuslab.sparksql.MainClass --master yarn --deploy-mode cluster --files /etc/spark2/conf/hive-site.xml,/etc/spark2/conf/hbase-site.xml /tmp/spark-hive-test/spark_sql_under_the_hood-spark2.2.0.jar
It worked in my case.
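As I understand it (worth double-checking against the Spark docs for your version), files passed with --files are localized by YARN into each container's working directory, so in cluster mode the application can open them by bare file name. A minimal sketch of that, with an illustrative class name and property lookup:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;

public class LoadShippedConfigs {
    public static void main(String[] args) {
        // hive-site.xml and hbase-site.xml were shipped with --files, so they
        // sit in the YARN container's working directory under their bare names.
        Configuration conf = new Configuration();
        conf.addResource(new Path("hive-site.xml"));
        conf.addResource(new Path("hbase-site.xml"));
        // Sanity check: a property expected to come from the shipped hive-site.xml.
        System.out.println(conf.get("hive.metastore.uris"));
    }
}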
03-08-2018
10:22 AM
I've removed the "hbase.regionserver.kerberos.principal" line from the code, but I am still getting the same error.
03-08-2018
10:22 AM
Output of the listprincs command in kadmin.local:
HTTP/ambari-devup.mstorm.com@MSTORM.COM
HTTP/dn1-devup.mstorm.com@MSTORM.COM
HTTP/dn2-devup.mstorm.com@MSTORM.COM
HTTP/dn3-devup.mstorm.com@MSTORM.COM
HTTP/dn4-devup.mstorm.com@MSTORM.COM
HTTP/hbase1-devup.mstorm.com@MSTORM.COM
HTTP/hbase2-devup.mstorm.com@MSTORM.COM
HTTP/snn-devup.mstorm.com@MSTORM.COM
HTTP/zk1-devup.mstorm.com@MSTORM.COM
HTTP/zk2-devup.mstorm.com@MSTORM.COM
HTTP/zk3-devup.mstorm.com@MSTORM.COM
K/M@MSTORM.COM
admin/admin@MSTORM.COM
ambari-qa-ambari_devup@MSTORM.COM
ambari-server-ambari_devup@MSTORM.COM
ambari-server@MSTORM.COM
dn/dn1-devup.mstorm.com@MSTORM.COM
dn/dn2-devup.mstorm.com@MSTORM.COM
dn/dn3-devup.mstorm.com@MSTORM.COM
dn/dn4-devup.mstorm.com@MSTORM.COM
hbase-ambari_devup@MSTORM.COM
hbase/dn1-devup.mstorm.com@MSTORM.COM
hbase/dn2-devup.mstorm.com@MSTORM.COM
hbase/dn3-devup.mstorm.com@MSTORM.COM
hbase/dn4-devup.mstorm.com@MSTORM.COM
hbase/hbase1-devup.mstorm.com@MSTORM.COM
hbase/hbase2-devup.mstorm.com@MSTORM.COM
hdfs-ambari_devup@MSTORM.COM
hdfs/ambari-devup.mstorm.com@MSTORM.COM
infra-solr/hbase2-devup.mstorm.com@MSTORM.COM
jhs/hbase1-devup.mstorm.com@MSTORM.COM
kadmin/admin@MSTORM.COM
kadmin/ambari-devup.mstorm.com@MSTORM.COM
kadmin/changepw@MSTORM.COM
kafka/zk1-devup.mstorm.com@MSTORM.COM
kafka/zk2-devup.mstorm.com@MSTORM.COM
kafka/zk3-devup.mstorm.com@MSTORM.COM
kiprop/ambari-devup.mstorm.com@MSTORM.COM
krbtgt/MSTORM.COM@MSTORM.COM
livy/ambari-devup.mstorm.com@MSTORM.COM
nfs/dn4-devup.mstorm.com@MSTORM.COM
nm/dn1-devup.mstorm.com@MSTORM.COM
nm/dn2-devup.mstorm.com@MSTORM.COM
nm/dn3-devup.mstorm.com@MSTORM.COM
nm/dn4-devup.mstorm.com@MSTORM.COM
nn/ambari-devup.mstorm.com@MSTORM.COM
nn/hbase2-devup.mstorm.com@MSTORM.COM
rm/ambari-devup.mstorm.com@MSTORM.COM
spark-ambari_devup@MSTORM.COM
yarn/snn-devup.mstorm.com@MSTORM.COM
zeppelin-ambari_devup@MSTORM.COM
zookeeper/zk1-devup.mstorm.com@MSTORM.COM
zookeeper/zk2-devup.mstorm.com@MSTORM.COM
zookeeper/zk3-devup.mstorm.com@MSTORM.COM
03-08-2018
08:56 AM
Where is FIELD.HORTONWORKS.COM (below) coming from? I am trying a Java client example to connect to the Kerberos-enabled HBase cluster, and that realm is mentioned in the example. Also, I've already installed JCE on each node.
03-08-2018
08:52 AM
@Geoffrey Shelton Okot Now I get the following error:
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Couldn't setup connection for hbase/hbase1-devup.mstorm.com@MSTORM.COM to hbase/hbase1-devup.mstorm.com@MSTORM.COM
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$1.run(RpcClientImpl.java:696)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.handleSaslConnectionFailure(RpcClientImpl.java:668)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:777)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:920)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:889)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1222)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:32651)
at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:372)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:199)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:62)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:346)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:320)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:64)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hadoop.ipc.RemoteException: GSS initiate failed
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.readStatus(HBaseSaslRpcClient.java:153)
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:189)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:642)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:166)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:769)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:766)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:766)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:920)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:889)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1222)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:32651)
at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:372)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:199)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:62)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:346)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:320)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:64)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Results:
Tests in error:
HBaseClientTest.testingggAuth:51 » RetriesExhausted Failed after attempts=36, ...
Please let me know what the issue might be. Thanks in advance.
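One thing I still plan to try (a minimal sketch, assuming the keytab login itself succeeds; the class name, keytab path, and table name are placeholders): running the HBase call inside the logged-in UGI's doAs block, so the RPC layer is guaranteed to pick up the Kerberos credentials:

import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosScanCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);
        // Log in from the keytab and keep the resulting UGI explicitly.
        UserGroupInformation ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
                "hbase/hbase1-devup.mstorm.com@MSTORM.COM", // principal from listprincs above
                "/path/to/hbase.service.keytab");           // placeholder path
        // Execute the HBase call under these credentials.
        ugi.doAs((PrivilegedExceptionAction<Void>) () -> {
            try (Connection conn = ConnectionFactory.createConnection(conf)) {
                System.out.println("table exists: "
                        + conn.getAdmin().tableExists(TableName.valueOf("table_name")));
            }
            return null;
        });
    }
}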
03-07-2018
01:17 PM
KrbException: Server not found in Kerberos database (7) - LOOKING_UP_SERVER
at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:73)
at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:251)
at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:262)
at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:308)
at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:126)
at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:458)
at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:693)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
at org.apache.zookeeper.client.ZooKeeperSaslClient$2.run(ZooKeeperSaslClient.java:366)
at org.apache.zookeeper.client.ZooKeeperSaslClient$2.run(ZooKeeperSaslClient.java:363)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.zookeeper.client.ZooKeeperSaslClient.createSaslToken(ZooKeeperSaslClient.java:362)
at org.apache.zookeeper.client.ZooKeeperSaslClient.createSaslToken(ZooKeeperSaslClient.java:348)
at org.apache.zookeeper.client.ZooKeeperSaslClient.sendSaslPacket(ZooKeeperSaslClient.java:420)
at org.apache.zookeeper.client.ZooKeeperSaslClient.initialize(ZooKeeperSaslClient.java:458)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1013)
Caused by: KrbException: Identifier doesn't match expected value (906)
at sun.security.krb5.internal.KDCRep.init(KDCRep.java:140)
at sun.security.krb5.internal.TGSRep.init(TGSRep.java:65)
at sun.security.krb5.internal.TGSRep.<init>(TGSRep.java:60)
at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:55)
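From what I've read, "Server not found in Kerberos database (7)" during the ZooKeeper SASL handshake usually means the client built a zookeeper/<host> service principal from a hostname the KDC does not know, often because forward and reverse DNS disagree. A small check (the hostname here is just one example from this cluster; the class name is mine):

import java.net.InetAddress;

public class DnsCheck {
    public static void main(String[] args) throws Exception {
        // Repeat for every ZooKeeper/HBase host the client connects to.
        InetAddress addr = InetAddress.getByName("zk1-devup.mstorm.com");
        System.out.println("forward lookup: " + addr.getHostAddress());
        // Kerberos canonicalizes the service host via reverse DNS; this should
        // print the same FQDN that has a zookeeper/<fqdn>@MSTORM.COM principal.
        System.out.println("reverse lookup: "
                + InetAddress.getByAddress(addr.getAddress()).getCanonicalHostName());
    }
}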
Labels:
Hortonworks Data Platform (HDP)
03-07-2018
10:26 AM
@Geoffrey Shelton Okot Thanks for the reply. I've done the same configuration again, but I'm still getting the same error. I enabled debug logging and found the following:

>>>Pre-Authentication Data: PA-DATA type = 136
>>>Pre-Authentication Data: PA-DATA type = 19
PA-ETYPE-INFO2 etype = 18, salt = MSTORM.COMhbasehbase1-devup.mstorm.com, s2kparams = null
PA-ETYPE-INFO2 etype = 23, salt = MSTORM.COMhbasehbase1-devup.mstorm.com, s2kparams = null
PA-ETYPE-INFO2 etype = 16, salt = MSTORM.COMhbasehbase1-devup.mstorm.com, s2kparams = null
>>>Pre-Authentication Data: PA-DATA type = 2
PA-ENC-TIMESTAMP
>>>Pre-Authentication Data: PA-DATA type = 133
>>> KdcAccessibility: remove ambari-devup.mstorm.com
>>> KDCRep: init() encoding tag is 126 req type is 11
>>>KRBError:
cTime is Sat Aug 28 17:12:22 UTC 2032 1977325942000
sTime is Wed Mar 07 10:15:19 UTC 2018 1520417719000
suSec is 507841
error code is 25
error Message is Additional pre-authentication required
cname is hbase/hbase1-devup.mstorm.com@MSTORM.COM
sname is krbtgt/MSTORM.COM@MSTORM.COM
eData provided.
msgType is 30
>>>Pre-Authentication Data: PA-DATA type = 136
>>>Pre-Authentication Data: PA-DATA type = 19
PA-ETYPE-INFO2 etype = 18, salt = MSTORM.COMhbasehbase1-devup.mstorm.com, s2kparams = null
PA-ETYPE-INFO2 etype = 23, salt = MSTORM.COMhbasehbase1-devup.mstorm.com, s2kparams = null
PA-ETYPE-INFO2 etype = 16, salt = MSTORM.COMhbasehbase1-devup.mstorm.com, s2kparams = null
>>>Pre-Authentication Data: PA-DATA type = 2
PA-ENC-TIMESTAMP
>>>Pre-Authentication Data: PA-DATA type = 133
KRBError received: NEEDED_PREAUTH
KrbAsReqBuilder: PREAUTH FAILED/REQ, re-send AS-REQ

After that error, here is the sample test I am using to check whether a table is present in HBase:

package com.hbase;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.security.UserGroupInformation;
import org.apache.log4j.Level;
import org.apache.log4j.Logger;
import org.junit.Test;

public class HBaseClientTest {

    @Test
    public void testingggAuth() throws Exception {
        try {
            Logger.getRootLogger().setLevel(Level.DEBUG);

            Configuration configuration = HBaseConfiguration.create();
            // ZooKeeper quorum and HBase endpoints
            configuration.set("hbase.zookeeper.quorum", "node1,node2,node3");
            configuration.set("hbase.master", "hbase_node:60000");
            configuration.set("hbase.zookeeper.property.clientPort", "2181");
            configuration.set("hadoop.security.authentication", "kerberos");
            configuration.set("hbase.security.authentication", "kerberos");
            configuration.set("zookeeper.znode.parent", "/hbase");
            // configuration.set("hbase.cluster.distributed", "true"); // check this setting on the HBase side
            // configuration.set("hbase.rpc.protection", "authentication");
            // What principal the master/region servers use:
            // configuration.set("hbase.regionserver.kerberos.principal", "hbase/_HOST@FIELD.HORTONWORKS.COM");
            // configuration.set("hbase.regionserver.keytab.file", "src/hbase.service.keytab");
            // This is needed even if you connect over rpc/zookeeper:
            // configuration.set("hbase.master.kerberos.principal", "_host@REALM");
            // configuration.set("hbase.master.keytab.file", "/home/developers/Music/hbase.service.keytab");

            System.setProperty("java.security.auth.login.config", "/path/to/hbase_master_jaas.conf");
            System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");
            // Enable/disable krb5 debugging
            System.setProperty("sun.security.krb5.debug", "true");

            String principal = System.getProperty("kerberosPrincipal",
                    "hbase/hbase1-devup.mstorm.com@MSTORM.COM");
            String keytabLocation = System.getProperty("kerberosKeytab",
                    "/path/to/hbase.service.keytab");

            // kinit with principal and keytab
            UserGroupInformation.setConfiguration(configuration);
            UserGroupInformation.loginUserFromKeytab(principal, keytabLocation);
            // UserGroupInformation userGroupInformation = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
            //         "hbase-ambari_devup@MSTORM.COM", "/path/to/hbase.headless.keytab");
            // UserGroupInformation.setLoginUser(userGroupInformation);
            System.out.println("Kerberos login succeeded");

            Connection connection = ConnectionFactory.createConnection(HBaseConfiguration.create(configuration));
            System.out.println("Connection created");
            System.out.println("Table available: "
                    + connection.getAdmin().isTableAvailable(TableName.valueOf("table_name")));
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

Please help me with this, as I am stuck authenticating a remote connection to HBase on this Kerberos-enabled cluster. Thank you in advance.