Support Questions

hdfs block corrupt

Super Collaborator

Hi:

We get this error when running a query from Hive:

errorMessage='java.io.IOException: java.io.IOException: Cannot obtain block length for LocatedBlock{BP-277687468-192.168.11.18-1438941203340:blk_1091826778_18212579; getBlockSize()=636546; corrupt=false

Also, HDFS reports as HEALTHY. Any suggestions?

Thanks

1 ACCEPTED SOLUTION

Super Collaborator

The problem was solved by running the command hdfs debug recoverLease -path <path-of-the-file> [-retries <retry-times>].
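For context: this error usually means the file was left open for writing and its last block was never finalized (for example, a writer process was killed mid-write), rather than actual block corruption — which is also why fsck reports HEALTHY. A sketch of finding and recovering such files (the warehouse path and retry count below are placeholders, not from this thread):

```shell
# List files still open for write under the Hive warehouse;
# these are often the culprits behind "Cannot obtain block length"
hdfs fsck /user/hive/warehouse -openforwrite

# Force lease recovery on an affected file so its length can be finalized
hdfs debug recoverLease -path <path-of-the-file> -retries 3
```

After lease recovery succeeds, the NameNode can report the block length and the Hive query should be able to read the file again.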


2 REPLIES

Expert Contributor

@Roberto Sancho

This error can occur if you do not have adequate permissions on the file. I would suggest running the 'fsck' command to determine whether there are any corrupt blocks on HDFS.

Is it only certain queries that fail, or all queries?

I would also locate the underlying files on HDFS and check their permissions.
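For reference, the checks above could look something like this (the file path is a placeholder for the file referenced in the error message):

```shell
# Check overall HDFS health and list any corrupt file blocks
hdfs fsck / -list-corruptfileblocks

# Inspect a specific file's blocks and their locations
hdfs fsck <path-of-the-file> -files -blocks -locations

# Check ownership and permissions on the underlying file
hdfs dfs -ls <path-of-the-file>
```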

