06-07-2017
08:53 AM
It seems that if we are not confident those configuration changes will solve the HDFS IOException in our Hive application, it's better not to mess around with them. Do you have any suggestions about the root cause of our exception?
06-06-2017
01:37 PM
Our Hive application is failing with the following error:

[HiveServer2-Handler-Pool: Thread-3522853]: Job Submission failed with exception 'java.io.IOException(Unable to close file because the last block does not have enough number of replicas.)'
java.io.IOException: Unable to close file because the last block does not have enough number of replicas.
at org.apache.hadoop.hdfs.DFSOutputStream.completeFile(DFSOutputStream.java:2600)
at org.apache.hadoop.hdfs.DFSOutputStream.closeImpl(DFSOutputStream.java:2562)

We decided to try some suggested workarounds, such as increasing the following settings, but Cloudera Manager doesn't have an option to set them:

- dfs.client.block.write.locateFollowingBlock.retries
- dfs.client.retry.interval-ms.get-last-block-length
- dfs.blockreport.incremental.intervalMsec

Any suggestions?
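For reference, the kind of change we were attempting would look something like the snippet below in hdfs-site.xml. The values shown are illustrative guesses, not recommendations, and our assumption (untested) is that settings Cloudera Manager doesn't expose as dedicated fields can still be passed through its "Advanced Configuration Snippet (Safety Valve) for hdfs-site.xml":

```xml
<!-- hdfs-site.xml fragment; values are illustrative, not recommendations. -->
<property>
  <!-- Client-side retries when asking the NameNode for the next block
       (also consulted on file close while waiting for replicas) -->
  <name>dfs.client.block.write.locateFollowingBlock.retries</name>
  <value>10</value>
</property>
<property>
  <!-- Client retry interval (ms) when fetching the last block's length -->
  <name>dfs.client.retry.interval-ms.get-last-block-length</name>
  <value>4000</value>
</property>
<property>
  <!-- Interval (ms) between incremental block reports from DataNodes -->
  <name>dfs.blockreport.incremental.intervalMsec</name>
  <value>100</value>
</property>
```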
Labels:
- Apache Hive
- HDFS