
Getting a timeout exception while reading from HBase using Spark Scala (with shc 1.1.1-2.1-s_2.11)

While reading from HBase I'm getting the error below:

Caused by: Call to .local/:60020 failed on local exception: org.apache.hadoop.hbase.ipc.CallTimeoutException: Call id=15, waitTime=59998, operationTimeout=59997 expired.
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.wrapException(
    at
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(
    at
    at
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(
    ... 4 more
Caused by: org.apache.hadoop.hbase.ipc.CallTimeoutException: Call id=15, waitTime=59998, operationTimeout=59997 expired.
    at org.apache.hadoop.hbase.ipc.Call.checkAndSetTimeout(
    at

However, I'm setting the configuration below in code to avoid the timeout, but somehow these properties are not being overridden:

hbaseConf.set("hbase.rpc.timeout", "1800000")
hbaseConf.set("hbase.client.scanner.timeout.period", "1800000")

Wondering if I'm missing something. Can anyone please help?


Super Guru
@vivek jain

Could you make those hbaseConf.set property changes directly in the hbase-site.xml file that Spark uses, instead of setting the property values in the Spark job, and then run spark-submit with the newly changed hbase-site.xml?
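For reference, the suggestion above amounts to adding entries like these to hbase-site.xml. The property names and the 1800000 ms value come from the asker's code; this is a sketch of the file fragment, not a copy of any particular cluster's configuration:

```xml
<!-- Raise the HBase client RPC timeout (milliseconds). -->
<property>
  <name>hbase.rpc.timeout</name>
  <value>1800000</value>
</property>
<!-- Raise the scanner lease/timeout period (milliseconds). -->
<property>
  <name>hbase.client.scanner.timeout.period</name>
  <value>1800000</value>
</property>
```

Note that hbase.client.scanner.timeout.period also governs the scanner lease on the region server side, so a client-only change may still hit lease expiry if the server keeps a lower value.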

Hi @Shu, I wanted to keep that as a last resort, since it would impact everyone else's work, which is not nice. Isn't there a way to override those properties at all?
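One common way to change these values for a single job without editing the shared hbase-site.xml is to ship a job-local copy of the file with spark-submit, so only this job picks it up. This is a hedged sketch, not something confirmed in the thread: the paths and main class are placeholders, and it assumes a YARN deployment, where files passed via --files are placed in each container's working directory, which is on the executor classpath:

```shell
# Copy the cluster's hbase-site.xml, raise the timeout properties only in this
# copy, and ship the copy with the job so other users' jobs are unaffected.
# /path/to/job/hbase-site.xml, com.example.HBaseReadJob, and my-job.jar are
# placeholders for your own edited file, main class, and application jar.
spark-submit \
  --master yarn \
  --files /path/to/job/hbase-site.xml \
  --class com.example.HBaseReadJob \
  my-job.jar
```

Whether the shc connector honors this copy depends on which hbase-site.xml wins on the classpath, so it is worth verifying the effective timeout values from the job's logs.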
