While putting data into HBase using the HTable.put method, I occasionally end up with the exception below. However, the data is actually written to HBase: when I check with a get operation for that particular row key, it is there.
I have searched the logs of both the HMaster and the HRegionServers for that time window to identify the issue, but was unable to find anything.
Please help me fine-tune the HBase configuration in order to avoid this InterruptedIOException.
java.io.InterruptedIOException: #17209, interrupted. currentNumberOfTask=1
    at org.apache.hadoop.hbase.client.AsyncProcess.waitForMaximumCurrentTasks(AsyncProcess.java:1764)
    at org.apache.hadoop.hbase.client.AsyncProcess.waitForMaximumCurrentTasks(AsyncProcess.java:1734)
    at org.apache.hadoop.hbase.client.AsyncProcess.waitForAllPreviousOpsAndReset(AsyncProcess.java:1810)
    at org.apache.hadoop.hbase.client.BufferedMutatorImpl.backgroundFlushCommits(BufferedMutatorImpl.java:240)
    at org.apache.hadoop.hbase.client.BufferedMutatorImpl.flush(BufferedMutatorImpl.java:190)
    at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1434)
    at org.apache.hadoop.hbase.client.HTable.put(HTable.java:1018)
Please help me solve it.
Someone else has faced the same exception, but in that thread there is no explanation of which configurations need to be checked in order to avoid it.
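From the stack trace, the interruption happens while the client waits for outstanding write tasks to drain during a flush (AsyncProcess.waitForMaximumCurrentTasks). The client-side settings that control how many writes can be in flight, and how large the write buffer grows before a flush, are the ones I have been looking at. A hedged sketch of a client-side hbase-site.xml (the values below are illustrative, not recommendations; defaults depend on the HBase version):

```xml
<!-- Client-side hbase-site.xml: settings that bound the write backlog
     the client waits on during flush. Values are illustrative only. -->
<configuration>
  <property>
    <!-- Maximum concurrent mutation tasks the client keeps in flight -->
    <name>hbase.client.max.total.tasks</name>
    <value>100</value>
  </property>
  <property>
    <!-- Maximum concurrent tasks per region server -->
    <name>hbase.client.max.perserver.tasks</name>
    <value>5</value>
  </property>
  <property>
    <!-- Write buffer size in bytes; a larger buffer means fewer,
         bigger flushes -->
    <name>hbase.client.write.buffer</name>
    <value>2097152</value>
  </property>
  <property>
    <!-- RPC timeout in milliseconds for each remote call -->
    <name>hbase.rpc.timeout</name>
    <value>60000</value>
  </property>
</configuration>
```

I am not sure which of these (if any) is the right knob for this exception, since the root cause seems to be the calling thread being interrupted while the flush is still waiting, rather than a timeout inside HBase itself.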