
HBase connection error from an external system when connecting through an edge node on the cluster

Contributor

Hi,

 

I am trying to connect to an HBase table from an external system using the Java API against a Cloudera 5.9 Hadoop cluster.

Our cluster has 2 edge nodes, 2 name nodes, and 10 data nodes.

ZooKeeper runs on an edge node, a name node, and a data node, and the HBase Master runs on a name node. I am able to make a connection to the HBase table through ZooKeeper by connecting to the edge node from my local machine (which is not part of the cluster). But when I try to scan, read, or write to the HBase table, it throws the error below.
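
Roughly, the client code follows the pattern below (the hostname, port, and table name here are placeholders for the real values; the scan is where the failure shows up):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;

public class HBaseScanSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // Quorum points at the edge node; hostname and port are placeholders.
        conf.set("hbase.zookeeper.quorum", "edgenode01.example.com");
        conf.set("hbase.zookeeper.property.clientPort", "2181");

        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("tbl_name"))) {
            // The connection itself is established fine; the exception is thrown
            // once a scan (or read/write) actually has to reach a RegionServer.
            Scan scan = new Scan();
            try (ResultScanner scanner = table.getScanner(scan)) {
                for (Result result : scanner) {
                    System.out.println(result);
                }
            }
        }
    }
}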

 

Can anyone help me fix this issue?

 

org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Tue Jul 18 10:00:53 CDT 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68147: row 'tbl_name,,99999999999999' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=dn03.com,60020,1499366811145, seqNum=0

at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:264)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:199)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:56)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
at org.apache.hadoop.hbase.client.ClientSmallReversedScanner.next(ClientSmallReversedScanner.java:145)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1200)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1109)
at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:293)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:131)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:56)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:287)
at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:267)
at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:139)
at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:134)
at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:823)
at com.seagate.dars.imagedefectanalysis.dao.HBaseDao.saveFiles2FS(HBaseDao.java:177)
at com.seagate.dars.imagedefectanalysis.ImageDefectAnalysis.main(ImageDefectAnalysis.java:40)
Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=68147: row 'tbl_name,,99999999999999' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=dn03.com,60020,1499366811145, seqNum=0
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:159)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:294)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:275)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.UnknownHostException: unknown host: dn03.com
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.<init>(RpcClientImpl.java:296)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.createConnection(RpcClientImpl.java:129)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.getConnection(RpcClientImpl.java:1278)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1152)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:216)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:300)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:31751)
at org.apache.hadoop.hbase.client.ClientSmallScanner$SmallScannerCallable.call(ClientSmallScanner.java:176)
at org.apache.hadoop.hbase.client.ClientSmallScanner$SmallScannerCallable.call(ClientSmallScanner.java:155)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
... 6 more

 

 

 

2 Replies

Mentor

Contributor
Should I pass the /etc/hosts entries (for all nodes in the cluster, including the edge nodes, name nodes, and data nodes) in the Java code instead of getting them from the host I am connecting to (the edge node)?
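
To illustrate what I mean, the sketch below shows the kind of client-side settings I could pass in code (hostnames are placeholders). As far as I can tell, though, the RegionServer hostname in the trace (dn03.com) is still looked up through normal Java/OS name resolution, so I am not sure host-to-IP mappings can be passed this way at all:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

public class HBaseClientConfigSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();

        // These client settings can be passed in code instead of via hbase-site.xml
        // (the hostname below is a placeholder for the real edge node).
        conf.set("hbase.zookeeper.quorum", "edgenode01.example.com");
        conf.set("hbase.zookeeper.property.clientPort", "2181");

        // Assumption: there is no HBase property that accepts host-to-IP mappings;
        // RegionServer hostnames returned from hbase:meta (e.g. dn03.com above) are
        // resolved by the client JVM through the OS resolver (/etc/hosts or DNS).
        try (Connection connection = ConnectionFactory.createConnection(conf)) {
            System.out.println("Connection open: " + !connection.isClosed());
        }
    }
}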