
hdfs block corrupt

Super Collaborator

Hi:

We get this error when running a query from Hive:

errorMessage='java.io.IOException: java.io.IOException: Cannot obtain block length for LocatedBlock{BP-277687468-192.168.11.18-1438941203340:blk_1091826778_18212579; getBlockSize()=636546; corrupt=false

Also, HDFS is reported as HEALTHY. Any suggestions?

Thanks

1 ACCEPTED SOLUTION

Re: hdfs block corrupt

Super Collaborator

The problem was solved by running the command hdfs debug recoverLease -path <path-of-the-file> [-retries <retry-times>].
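For reference, a typical sequence looks like this; the warehouse path and file name below are only placeholders, so substitute the location of the affected table. recoverLease forces lease recovery, which finalizes the last block of a file left open by an interrupted writer, a common cause of the "Cannot obtain block length" error:

# List files under the table's directory that are still open for write:
hdfs fsck /apps/hive/warehouse/mydb.db/mytable -openforwrite

# Recover the lease on each affected file, retrying up to 3 times:
hdfs debug recoverLease -path /apps/hive/warehouse/mydb.db/mytable/part-00000 -retries 3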

2 REPLIES

Re: hdfs block corrupt

Expert Contributor

@Roberto Sancho

This error can occur if you do not have adequate permissions on the file. I would suggest you run the 'fsck' command to determine if there are any corrupt blocks on HDFS.
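For example (the warehouse path below is only a placeholder for the table's actual HDFS location):

# Check the whole filesystem and list any corrupt blocks:
hdfs fsck / -list-corruptfileblocks

# Or inspect the table's directory in detail:
hdfs fsck /apps/hive/warehouse/mydb.db/mytable -files -blocks -locations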

Is it only certain queries that fail, or all queries?

I would also locate the underlying files on HDFS and check their permissions.
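Something like this, assuming the table sits under the default Hive warehouse directory (check its real LOCATION with DESCRIBE FORMATTED in Hive):

# Recursively list the table's data files with their owners and permissions:
hdfs dfs -ls -R /apps/hive/warehouse/mydb.db/mytable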
