HBase CallTimeoutException for Hive on HBase tables

Contributor

Hello Team,

 

We are experiencing an issue with Hive-on-HBase tables, where queries fail with the error below. Note: pure Hive tables and direct queries against HBase both work fine.

Failed after attempts=11, exceptions:
2022-02-28T12:12:54.719Z, java.net.SocketTimeoutException: callTimeout=60000, callDuration=60115: Call to rs.host.500/rs.host.ip.500:16020 failed on local exception: org.apache.hadoop.hbase.ipc.CallTimeoutException: Call[id=5,methodName=Scan], waitTime=60002, rpcTimeout=59991 row '' on table 'a-hbase-table' at region=a-hbase-table,,1598840250675.94c031e70c63dbb0f4726251987eb4ec., hostname=rs.host.500,16020,1645349772604, seqNum=550418

org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=11, exceptions:
2022-02-28T12:12:54.719Z, java.net.SocketTimeoutException: callTimeout=60000, callDuration=60115: Call to rs.host.500/rs.host.ip.500:16020 failed on local exception: org.apache.hadoop.hbase.ipc.CallTimeoutException: Call[id=5,methodName=Scan], waitTime=60002, rpcTimeout=59991 row '' on table 'a-hbase-table' at region=a-hbase-table,,1598840250675.94c031e70c63dbb0f4726251987eb4ec., hostname=rs.host.500,16020,1645349772604, seqNum=550418
	at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:299)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:251)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:58)
	at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithoutRetries(RpcRetryingCallerImpl.java:192)
	at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:267)
	at org.apache.hadoop.hbase.client.ClientScanner.loadCache(ClientScanner.java:435)
	at org.apache.hadoop.hbase.client.ClientScanner.nextWithSyncCache(ClientScanner.java:310)
	at org.apache.hadoop.hbase.client.ClientScanner.next(ClientScanner.java:595)
	at org.apache.hadoop.hbase.client.ResultScanner.next(ResultScanner.java:97)
	at org.apache.hadoop.hbase.thrift.ThriftHBaseServiceHandler.scannerGetList(ThriftHBaseServiceHandler.java:858)
	at sun.reflect.GeneratedMethodAccessor39.invoke(Unknown Source)
	....
	....
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:171)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:129)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:375)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=60115: Call to rs.host.500/rs.host.ip.500:16020 failed on local exception: org.apache.hadoop.hbase.ipc.CallTimeoutException: Call[id=5,methodName=Scan], waitTime=60002, rpcTimeout=59991 row '' on table 'a-hbase-table' at region=a-hbase-table,,1598840250675.94c031e70c63dbb0f4726251987eb4ec., hostname=rs.host.500,16020,1645349772604, seqNum=550418
	at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:159)
	at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:80)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	... 1 more
Caused by: org.apache.hadoop.hbase.ipc.CallTimeoutException: Call to rs.host.500/rs.host.ip.500:16020 failed on local exception: org.apache.hadoop.hbase.ipc.CallTimeoutException: Call[id=5,methodName=Scan], waitTime=60002, rpcTimeout=59991
	at org.apache.hadoop.hbase.ipc.IPCUtil.wrapException(IPCUtil.java:209)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.onCallFinished(AbstractRpcClient.java:383)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.access$100(AbstractRpcClient.java:91)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:414)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:410)
	at org.apache.hadoop.hbase.ipc.Call.setTimeout(Call.java:110)
	at org.apache.hadoop.hbase.ipc.RpcConnection$1.run(RpcConnection.java:136)
	at org.apache.hbase.thirdparty.io.netty.util.HashedWheelTimer$HashedWheelTimeout.expire(HashedWheelTimer.java:672)
	at org.apache.hbase.thirdparty.io.netty.util.HashedWheelTimer$HashedWheelBucket.expireTimeouts(HashedWheelTimer.java:747)
	at org.apache.hbase.thirdparty.io.netty.util.HashedWheelTimer$Worker.run(HashedWheelTimer.java:472)
	... 1 more
Caused by: org.apache.hadoop.hbase.ipc.CallTimeoutException: Call[id=5,methodName=Scan], waitTime=60002, rpcTimeout=59991
	at org.apache.hadoop.hbase.ipc.RpcConnection$1.run(RpcConnection.java:137)
	... 4 more

We increased the hbase.regionserver.handler.count property from 30 to 48, and hbase.rpc.timeout from 60 seconds to 90 seconds.
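For reference, the two changes above correspond to entries like the following in the server-side hbase-site.xml (values taken from this post; it is assumed the affected HBase services were restarted for them to take effect):

```xml
<!-- hbase-site.xml (server side) -->
<property>
  <name>hbase.regionserver.handler.count</name>
  <!-- RPC handler threads per RegionServer, raised from 30 -->
  <value>48</value>
</property>
<property>
  <name>hbase.rpc.timeout</name>
  <!-- in milliseconds: 90 seconds, raised from the 60000 ms default -->
  <value>90000</value>
</property>
```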

 

Note that I have already checked the RegionServer logs referenced in the error, but found no issue there, and the error above still occurs. Also, although the RPC timeout is now set to 90 seconds, the error still reports a 60-second timeout.
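One possible explanation (an assumption on my part, not confirmed in this thread) for the error still showing 60000 ms is that the value comes from the client side: the Hive/HBase Thrift client reads its own copy of hbase-site.xml, and scan RPCs are additionally bounded by hbase.client.scanner.timeout.period. A sketch of the client-side settings that would need to match:

```xml
<!-- hbase-site.xml on the client side (e.g. on the HiveServer2/Thrift server classpath) -->
<property>
  <name>hbase.rpc.timeout</name>
  <!-- must be raised here too, or the client keeps its own 60000 ms default -->
  <value>90000</value>
</property>
<property>
  <name>hbase.client.scanner.timeout.period</name>
  <!-- Scan calls are also subject to this client-side timeout -->
  <value>90000</value>
</property>
```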

Can you please share a solution for this?

 

Best Regards

1 ACCEPTED SOLUTION

2 REPLIES


Super Collaborator

Hello @Sayed016 

 

We are marking the post as solved for now. If you encounter any issues after adding the two parameters as shared, feel free to report back here as well.

 

Regards, Smarak