
HDFS block corrupt

Super Collaborator

Hi:

We get this error when running a query from Hive:

errorMessage='java.io.IOException: java.io.IOException: Cannot obtain block length for LocatedBlock{BP-277687468-192.168.11.18-1438941203340:blk_1091826778_18212579; getBlockSize()=636546; corrupt=false

Also, HDFS reports as HEALTHY. Any suggestions?

Thanks

1 ACCEPTED SOLUTION

Super Collaborator

The problem was solved by running the command hdfs debug recoverLease -path <path-of-the-file> [-retries <retry-times>].


2 REPLIES

Expert Contributor

@Roberto Sancho

This error can occur if you do not have adequate permissions on the file. I would suggest you run the 'fsck' command to determine if there are any corrupt blocks on HDFS.
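A quick sketch of that check, assuming the table data lives under a hypothetical warehouse path (substitute the real location of your table):

# Summarise cluster health and list any corrupt or missing blocks
hdfs fsck / -list-corruptfileblocks

# Inspect the table directory in detail, showing files, blocks and their locations
hdfs fsck /apps/hive/warehouse/mydb.db/mytable -files -blocks -locations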

Is it only certain queries that fail, or all queries?

I would also locate the underlying files on HDFS and check their permissions.
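For example, assuming the same hypothetical table path as above, the underlying files and their permissions can be checked with:

# List owner, group and permission bits for every file under the table directory
hdfs dfs -ls -R /apps/hive/warehouse/mydb.db/mytable

# Verify the querying user can actually read one of the data files (file name is illustrative)
hdfs dfs -cat /apps/hive/warehouse/mydb.db/mytable/000000_0 | head -n 5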

Super Collaborator

The problem was solved by running the command hdfs debug recoverLease -path <path-of-the-file> [-retries <retry-times>].
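For reference, a sketch of how this might be run; the affected file typically shows up as open-for-write in fsck first, and the path below is hypothetical:

# Find files still open for write under the table directory
hdfs fsck /apps/hive/warehouse/mydb.db/mytable -openforwrite

# Ask the NameNode to recover the lease on the affected file, retrying up to 5 times
hdfs debug recoverLease -path /apps/hive/warehouse/mydb.db/mytable/000000_0 -retries 5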
