Created on 06-06-2017 01:37 PM - edited 09-16-2022 04:42 AM
Our Hive application is failing with the following error:
[HiveServer2-Handler-Pool: Thread-3522853]: Job Submission failed with exception 'java.io.IOException(Unable to close file because the last block does not have enough number of replicas.)'
java.io.IOException: Unable to close file because the last block does not have enough number of replicas.
    at org.apache.hadoop.hdfs.DFSOutputStream.completeFile(DFSOutputStream.java:2600)
    at org.apache.hadoop.hdfs.DFSOutputStream.closeImpl(DFSOutputStream.java:2562)
We tried several suggested fixes, such as increasing the following settings, but Cloudera Manager (CM) doesn't expose options for them:
dfs.client.block.write.locateFollowingBlock.retries
dfs.client.retry.interval-ms.get-last-block-length
dfs.blockreport.incremental.intervalMsec
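For reference, in a plain Hadoop deployment these properties would normally go into the client-side hdfs-site.xml; something like the snippet below (the values here are only illustrative, not recommendations):

```xml
<!-- hdfs-site.xml: client-side retry tuning (example values only) -->
<property>
  <name>dfs.client.block.write.locateFollowingBlock.retries</name>
  <value>7</value> <!-- default is 5; raising it gives the NameNode more time -->
</property>
<property>
  <name>dfs.client.retry.interval-ms.get-last-block-length</name>
  <value>4000</value>
</property>
<property>
  <name>dfs.blockreport.incremental.intervalMsec</name>
  <value>100</value>
</property>
```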
Any suggestions on how to set these through CM?