Champion Alumni
Posts: 196
Registered: ‎11-18-2014

Cannot obtain block length - error not persistent


 

Hello,

 

Today my job failed because:

Cannot obtain block length for LocatedBlock{BP-1623273649-<IP>-1419337015794:blk_1076750425_3012802; getBlockSize()=16188; corrupt=false; offset=0; locs=[<IP-Worker1>:50010, <IP-Worker-2>:50010, <IP-WORKER-3>:50010]}

I tried to find the filename:

hdfs fsck / -files -blocks | grep 'BP-1623273649-<IP>-1419337015794:blk_1076750425'

but it returned nothing (no file found with this block ID).
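As a side note, the block ID can be pulled out of that error message automatically before feeding it to fsck. A minimal sketch using the error line from this post (the hdfs call at the end needs a live cluster, so it is only shown commented out; newer HDFS versions also accept the block ID directly via `-blockId`):

```shell
# The "Cannot obtain block length" message from the failed job.
err='Cannot obtain block length for LocatedBlock{BP-1623273649-<IP>-1419337015794:blk_1076750425_3012802; getBlockSize()=16188; corrupt=false; offset=0; locs=[...]}'

# The block ID is the blk_<number> part, without the _<generationStamp> suffix.
blkid=$(printf '%s' "$err" | grep -o 'blk_[0-9]*' | head -n 1)
echo "$blkid"    # -> blk_1076750425

# On a running cluster you could then look it up, e.g.:
#   hdfs fsck / -files -blocks | grep "$blkid"
```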

 

I restarted the job twice, and the second time it worked, even though I didn't change anything. The cluster is in HA and the active NameNode didn't change.

 

 

Would you please explain to me what could have happened? 

 

Thank you,

 

GHERMAN Alina
Champion Alumni
Posts: 196
Registered: ‎11-18-2014

Re: Cannot obtain block length - error not persistent

Hello,

 

I just hit this one more time, and the source of the problem is:

2016-03-07 14:00:09,285 INFO org.apache.flume.sink.hdfs.BucketWriter: Creating hdfs://NameNode/path/to/file.json.tmp
2016-03-07 14:16:30,389 WARN org.apache.flume.sink.hdfs.BucketWriter: Caught IOException writing to HDFSWriter (Connection reset by peer). Closing file (hdfs://NameNode/path/to/file.json.tmp) and rethrowing exception.
2016-03-07 14:16:30,389 INFO org.apache.flume.sink.hdfs.BucketWriter: Closing hdfs://NameNode/path/to/file.json.tmp
2016-03-07 14:16:30,389 WARN org.apache.flume.sink.hdfs.BucketWriter: failed to close() HDFSWriter for file (hdfs://NameNode/path/to/file.json.tmp). Exception follows.
2016-03-07 14:16:30,390 INFO org.apache.flume.sink.hdfs.BucketWriter: Renaming hdfs://NameNode/path/to/file.json.tmp to hdfs://NameNode/path/to/file.json

I have this problem for all files for which I got the failed close() error shown above.
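For what it's worth, files whose writer died before close() keep an open lease, and their last block stays in an under-construction state, which is exactly what produces "Cannot obtain block length" on read. fsck can list such files with `-openforwrite`. A small sketch on the sample path from the log above (the hdfs call needs a running cluster, so it is commented out; the runnable part only strips the scheme/authority prefix to get the fsck path):

```shell
# Path reported by the Flume BucketWriter log above.
path='hdfs://NameNode/path/to/file.json'

# fsck wants an absolute HDFS path, not a full URI; strip the prefix.
fsck_path=${path#hdfs://NameNode}
echo "$fsck_path"    # -> /path/to/file.json

# On a live cluster, list open-for-write files under that path:
#   hdfs fsck "$fsck_path" -files -blocks -locations -openforwrite
```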

 

Alina

GHERMAN Alina
Champion Alumni
Posts: 196
Registered: ‎11-18-2014

Re: Cannot obtain block length - error not persistent

I think that I found a possible cause of the problem.

 

At the time indicated by the Flume logs, the Flume service itself was fine (nothing special in its logs); however, at that same moment I had restarted HDFS in order to deploy a new configuration...

 

However, I still have no solution for the lost files (other than not taking them into account)...

 

Alina

GHERMAN Alina
zbz
Explorer
Posts: 16
Registered: ‎10-07-2018

Re: Cannot obtain block length - error not persistent

Hi, when you look at the error 'Caused by: java.io.IOException: Cannot obtain block length for LocatedBlock{BP-19720802xxxxxxxxxx-xxxx-xxxx1:blk_1326223979_253064188',

run these commands:

 

1. hdfs fsck -blockId blk_1326223979

    then you will see the path of the file that the block belongs to.

 

2. hdfs debug recoverLease -path theFilePath -retries 3

 

and all should be OK.
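When more than one file is affected (as in the Flume case earlier in this thread), the two steps above can be wrapped in a loop. A hedged sketch, with the cluster-dependent hdfs calls commented out and the loop demonstrated on a hypothetical hard-coded file list:

```shell
# Hypothetical list of affected files; on a real cluster you would build it
# from fsck, e.g.:
#   openfiles=$(hdfs fsck / -openforwrite 2>/dev/null | grep OPENFORWRITE | awk '{print $1}')
openfiles='/path/a.json
/path/b.json'

# Recover the lease on each file so its block length can be finalized.
for f in $openfiles; do
  echo "recovering lease on $f"
  # hdfs debug recoverLease -path "$f" -retries 3
done
```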
