Explorer
Posts: 23
Registered: ‎02-21-2014
Accepted Solution

Flafka !!!

Hi guys,

 

Again, here I am on my journey to deploy Kafka+Flume. I'm receiving data from Kafka, which is working pretty well (as always !!!). But at the Flume agent I got the following message:

 

isFileClosed is not available in the version of HDFS being used. Flume will not attempt to close files if the close fails on the first attempt
java.lang.NoSuchMethodException: org.apache.hadoop.fs.LocalFileSystem.isFileClosed(org.apache.hadoop.fs.Path)
    at java.lang.Class.getMethod(Class.java:1665)
    at org.apache.flume.sink.hdfs.BucketWriter.getRefIsClosed(BucketWriter.java:180)
    at org.apache.flume.sink.hdfs.BucketWriter.open(BucketWriter.java:268)
    at org.apache.flume.sink.hdfs.BucketWriter.append(BucketWriter.java:514)
    at org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:418)
    at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:68)
    at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:147)
    at java.lang.Thread.run(Thread.java:745)

 

I'm using CDH 5.3 with Kerberos.

Any clue ????

 

Dino

Cloudera Employee
Posts: 8
Registered: ‎07-30-2013

Re: Flafka !!!

This message just means that Flume can't find the isFileClosed method on the filesystem it's writing to, so it won't retry closing files if the first close attempt fails. It's only an informational message, and nothing to worry about. In CDH 5.4, this feature was reworked so that Flume no longer retries closing files anyway.
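For reference, the close-retry behavior the message refers to is governed by the HDFS sink's hdfs.closeTries and hdfs.retryInterval settings. Below is a minimal sketch of a Kafka-source-to-HDFS-sink agent on a kerberized cluster; the agent name tier1, the topic, the ZooKeeper quorum, the HDFS path, and the principal/keytab are all placeholders you'd replace with your own values:

    # Hypothetical Flafka agent config (agent name "tier1" is a placeholder)
    tier1.sources  = kafka-source
    tier1.channels = mem-channel
    tier1.sinks    = hdfs-sink

    # Kafka source (Flafka in CDH 5.3)
    tier1.sources.kafka-source.type = org.apache.flume.source.kafka.KafkaSource
    tier1.sources.kafka-source.zookeeperConnect = zk1:2181
    tier1.sources.kafka-source.topic = my-topic
    tier1.sources.kafka-source.channels = mem-channel

    tier1.channels.mem-channel.type = memory

    # HDFS sink with Kerberos credentials
    tier1.sinks.hdfs-sink.type = hdfs
    tier1.sinks.hdfs-sink.channel = mem-channel
    tier1.sinks.hdfs-sink.hdfs.path = hdfs://namenode/flume/events/%Y-%m-%d
    tier1.sinks.hdfs-sink.hdfs.kerberosPrincipal = flume/_HOST@EXAMPLE.COM
    tier1.sinks.hdfs-sink.hdfs.kerberosKeytab = /etc/flume-ng/conf/flume.keytab

    # Close-retry behavior the log message is about:
    # how many times to attempt closing a file, and how long between attempts
    tier1.sinks.hdfs-sink.hdfs.closeTries = 1
    tier1.sinks.hdfs-sink.hdfs.retryInterval = 180

Note that the hdfs.path here uses an hdfs:// URI; the stack trace above mentions LocalFileSystem, which is what you'd see if the sink resolves to a local path instead.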

There is nothing you need to worry about here.