Spark Hbase connection issue

Solved

Contributor

Hi All,

I am hitting the following error while trying to connect to HBase from Spark (using newAPIHadoopRDD) on HDP 2.4.2. I have already tried increasing the RPC timeout in hbase-site.xml, but it did not help. Any idea how to fix this?

Exception in thread "main" org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Wed Nov 16 14:59:36 IST 2016, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=71216: row 'scores,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=hklvadcnc06.hk.standardchartered.com,16020,1478491683763, seqNum=0


       at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:271)
       at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:195)
       at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:59)
       at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
       at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320)
       at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:295)
       at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:160)
       at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:155)
       at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:821)
       at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:193)
       at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:89)
       at org.apache.hadoop.hbase.client.MetaScanner.allTableRegions(MetaScanner.java:324)
       at org.apache.hadoop.hbase.client.HRegionLocator.getAllRegionLocations(HRegionLocator.java:88)
       at org.apache.hadoop.hbase.util.RegionSizeCalculator.init(RegionSizeCalculator.java:94)
       at org.apache.hadoop.hbase.util.RegionSizeCalculator.<init>(RegionSizeCalculator.java:81)
       at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.getSplits(TableInputFormatBase.java:256)
       at org.apache.hadoop.hbase.mapreduce.TableInputFormat.getSplits(TableInputFormat.java:237)
       at org.apache.spark.rdd.NewHadoopRDD.getPartitions(NewHadoopRDD.scala:120)
       at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
       at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
       at scala.Option.getOrElse(Option.scala:120)
       at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
       at org.apache.spark.SparkContext.runJob(SparkContext.scala:1929)
       at org.apache.spark.rdd.RDD.count(RDD.scala:1157)
       at scb.Hbasetest$.main(Hbasetest.scala:85)
       at scb.Hbasetest.main(Hbasetest.scala)
Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=71216: row 'scores,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=hklvadcnc06.hk.standardchartered.com,16020,1478491683763, seqNum=0
       at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:159)
       at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:64)
       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
       at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Call to hklvadcnc06.hk.standardchartered.com/10.20.235.13:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Connection to hklvadcnc06.hk.standardchartered.com/10.20.235.13:16020 is closing. Call id=9, waitTime=171
       at org.apache.hadoop.hbase.ipc.RpcClientImpl.wrapException(RpcClientImpl.java:1281)
       at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1252)
       at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
       at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
       at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:32651)
       at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:372)
       at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:199)
       at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:62)
       at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
       at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:346)
       at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:320)
       at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
       ... 4 more
Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Connection to hklvadcnc06.hk.standardchartered.com/10.20.235.13:16020 is closing. Call id=9, waitTime=171
       at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.cleanupCalls(RpcClientImpl.java:1078)
       at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.close(RpcClientImpl.java:879)
       at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.run(RpcClientImpl.java:604)
16/11/16 14:59:36 INFO SparkContext: Invoking stop() from shutdown hook
1 ACCEPTED SOLUTION


Re: Spark Hbase connection issue

Contributor

Pointing the HBase conf directory to the Hadoop classpath resolved the above problem. Thanks!
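A minimal sketch of what this fix looks like in practice, assuming the default HDP client-config location /etc/hbase/conf and a placeholder jar name (both are assumptions, not from the thread):

```shell
# Put the HBase client configuration (hbase-site.xml) on both the driver
# and executor classpaths so the client resolves the correct ZooKeeper
# quorum and RPC/timeout settings instead of defaults.
spark-submit \
  --class scb.Hbasetest \
  --master yarn-client \
  --driver-class-path /etc/hbase/conf \
  --conf spark.executor.extraClassPath=/etc/hbase/conf \
  hbasetest.jar
```

Without hbase-site.xml on the classpath, the client falls back to default connection settings, which typically surfaces exactly as meta-scan timeouts like the one above.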

12 REPLIES

Re: Spark Hbase connection issue

Super Collaborator

Are you using a secured cluster?

Re: Spark Hbase connection issue

Contributor

Yes, we are using Kerberos. I have already set up the Kerberos authentication in my code:

// UserGroupInformation.setConfiguration(conf)

// val userGroupInformation = UserGroupInformation.loginUserFromKeytabAndReturnUGI("hadoop1@ZONE1.SCB.NET","C:\\Users\\1554160\\Downloads\\hadoop1.keytab")

// UserGroupInformation.setLoginUser(userGroupInformation)
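As an alternative to logging in programmatically via UserGroupInformation, Spark on YARN (1.4+, so including the Spark shipped with HDP 2.4.2) can perform the Kerberos login itself. A hedged sketch; the jar name and keytab path are placeholders, the principal is the one from the post:

```shell
# Let spark-submit obtain and renew the Kerberos ticket for the job.
# --principal/--keytab are supported for YARN deploy modes.
spark-submit \
  --class scb.Hbasetest \
  --master yarn-client \
  --principal hadoop1@ZONE1.SCB.NET \
  --keytab /path/to/hadoop1.keytab \
  hbasetest.jar
```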

Re: Spark Hbase connection issue

Super Collaborator

Re: Spark Hbase connection issue

Contributor

I have already checked the above link, but it is not useful for my case.

Thanks!

Re: Spark Hbase connection issue

Are you running the Spark job on the same cluster or from a different cluster? If it's from a different cluster, check whether the nodes of that cluster have access to hklvadcnc06.hk.standardchartered.com.
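A quick way to check this from a worker node (a sketch; the hostname and port are taken from the stack trace above, substitute your own RegionServer):

```shell
# Verify the RegionServer host resolves via the cluster's name service.
getent hosts hklvadcnc06.hk.standardchartered.com
# Verify the RegionServer RPC port (16020 by default) is reachable.
nc -zv hklvadcnc06.hk.standardchartered.com 16020
```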

Re: Spark Hbase connection issue

You can try to increase the log level to DEBUG for HBase and look at the log of the RegionServer you receive the ConnectionClosing error from. This is likely the HBase server denying your RPC for some reason. There should be a DEBUG message which tells you why the RPC was rejected.
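One way to raise the log level without restarting the RegionServer is the Hadoop daemonlog tool against the RegionServer's info port (16030 by default in HBase 1.1); a sketch assuming the hostname from the stack trace and the default HDP log location:

```shell
# Temporarily set the HBase loggers on the RegionServer to DEBUG.
hadoop daemonlog -setlevel hklvadcnc06.hk.standardchartered.com:16030 \
  org.apache.hadoop.hbase DEBUG

# Reproduce the failure, then inspect the RegionServer log.
tail -f /var/log/hbase/hbase-hbase-regionserver-*.log
```

The change made this way is in-memory only and reverts on restart; for a persistent change, edit log4j.properties (e.g. via Ambari) and restart the RegionServer.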

Re: Spark Hbase connection issue

Contributor

Where can I find the RegionServer logs?

Re: Spark Hbase connection issue

Wherever you configured them to be stored. It defaults to the standard log directory on Linux: /var/log/hbase

Re: Spark Hbase connection issue

Contributor

Thanks for your reply. The issue is resolved.
