
getting timeout exception while reading from HBase using spark scala (with shc 1.1.1-2.1-s_2.11)

Contributor

While reading from HBase I'm getting the error below:

Caused by: java.io.IOException: Call to .local/:60020 failed on local exception: org.apache.hadoop.hbase.ipc.CallTimeoutException: Call id=15, waitTime=59998, operationTimeout=59997 expired.
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.wrapException(RpcClientImpl.java:1263)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1231)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:218)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:292)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:32831)
    at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:219)
    at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:63)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:211)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:396)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:370)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:136)
    ... 4 more
Caused by: org.apache.hadoop.hbase.ipc.CallTimeoutException: Call id=15, waitTime=59998, operationTimeout=59997 expired.
    at org.apache.hadoop.hbase.ipc.Call.checkAndSetTimeout(Call.java:70)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1205)

However, I'm setting the configuration below in code to avoid the timeout; somehow these properties are not being picked up:

hbaseConf.set("hbase.rpc.timeout", "1800000")
hbaseConf.set("hbase.client.scanner.timeout.period", "1800000")

Wondering if I'm missing something. Can anyone please help?
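For what it's worth, the `operationTimeout=59997` in the stack trace is the HBase client's roughly 60-second default, which suggests the connection performing the scan never saw the overrides. One quick sanity check is to print the values from the same `Configuration` object right after setting them; `HBaseConfiguration.create()` starts from whichever hbase-site.xml is on the classpath, and executors may build their own `Configuration` that never sees driver-side `set` calls. A minimal sketch (the setup around it is assumed, not taken from your job):

```scala
import org.apache.hadoop.hbase.HBaseConfiguration

// HBaseConfiguration.create() loads hbase-site.xml from the classpath first;
// values set afterwards in code only exist on this particular object.
val hbaseConf = HBaseConfiguration.create()
hbaseConf.set("hbase.rpc.timeout", "1800000")
hbaseConf.set("hbase.client.scanner.timeout.period", "1800000")

// Confirm what this Configuration object actually ended up with.
println(s"hbase.rpc.timeout = ${hbaseConf.get("hbase.rpc.timeout")}")
println(s"hbase.client.scanner.timeout.period = ${hbaseConf.get("hbase.client.scanner.timeout.period")}")
```

If these print the expected values on the driver but the scan still times out after ~60 s, that points to the connector building its own configuration on the executors rather than using this one.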

2 REPLIES

Master Guru
@vivek jain

Could you make the hbaseConf.set property changes directly in the hbase-site.xml file used by Spark, instead of setting those property values in the Spark job, and then run spark-submit with the newly changed hbase-site.xml?

Contributor

Hi @Shu, I wanted to keep that as a last option, since changing the shared file would affect everyone else's jobs, which isn't nice. Isn't there any way to override those properties from within the job?
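One middle ground between editing the cluster-wide file and setting values in code is loading a job-private overrides file on top of the default configuration. This is a sketch under assumptions: `my-hbase-overrides.xml` is a hypothetical Hadoop-style XML file containing only the two timeout properties, and the path is a placeholder.

```scala
import org.apache.hadoop.fs.Path
import org.apache.hadoop.hbase.HBaseConfiguration

// Start from the cluster's hbase-site.xml on the classpath, then layer a
// job-private overrides file on top. Resources added later win for keys
// they redefine, so the shared hbase-site.xml stays untouched.
val hbaseConf = HBaseConfiguration.create()
hbaseConf.addResource(new Path("file:///path/to/my-hbase-overrides.xml")) // hypothetical path
```

If the job runs on YARN, the overrides file can be shipped alongside it with `spark-submit --files my-hbase-overrides.xml` so executors can find it in their working directory. Whether the shc connector honors a driver-side `Configuration` at all depends on the connector version, so this is worth verifying with a test run rather than taken as given.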