Hive fails because the last HDFS block does not have enough replicas


Our Hive application is failing with the following error:

[HiveServer2-Handler-Pool: Thread-3522853]: Job Submission failed with exception 'java.io.IOException(Unable to close file because the last block does not have enough number of replicas.)'
java.io.IOException: Unable to close file because the last block does not have enough number of replicas.
	at org.apache.hadoop.hdfs.DFSOutputStream.completeFile(DFSOutputStream.java:2600)
	at org.apache.hadoop.hdfs.DFSOutputStream.closeImpl(DFSOutputStream.java:2562)

We tried the common suggestions, such as increasing the following settings, but Cloudera Manager (CM) does not expose options for them (see the snippet below for the values we wanted to apply):

dfs.client.block.write.locateFollowingBlock.retries
dfs.client.retry.interval-ms.get-last-block-length
dfs.blockreport.incremental.intervalMsec 
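
For reference, this is roughly what we wanted to apply through an hdfs-site.xml override (for example a CM advanced configuration snippet / safety valve, if one applies); the values below are placeholders to illustrate the idea, not recommended settings:

<property>
  <!-- Placeholder value: retries the DFS client makes while waiting for the NameNode, e.g. when completing a file -->
  <name>dfs.client.block.write.locateFollowingBlock.retries</name>
  <value>10</value>
</property>
<property>
  <!-- Placeholder value: wait in ms between client retries when fetching the last block length -->
  <name>dfs.client.retry.interval-ms.get-last-block-length</name>
  <value>8000</value>
</property>
<property>
  <!-- Placeholder value: DataNode-side interval in ms for sending incremental block reports -->
  <name>dfs.blockreport.incremental.intervalMsec</name>
  <value>100</value>
</property>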

Any suggestions?
