Hi guys,
Again, here I am on my journey to deploy Kafka + Flume. I'm receiving data from Kafka, which is working pretty well (as always!). But at the Flume agent I get the following message:
isFileClosed is not available in the version of HDFS being used. Flume will not attempt to close files if the close fails on the first attempt
java.lang.NoSuchMethodException: org.apache.hadoop.fs.LocalFileSystem.isFileClosed(org.apache.hadoop.fs.Path)
    at java.lang.Class.getMethod(Class.java:1665)
    at org.apache.flume.sink.hdfs.BucketWriter.getRefIsClosed(BucketWriter.java:180)
    at org.apache.flume.sink.hdfs.BucketWriter.open(BucketWriter.java:268)
    at org.apache.flume.sink.hdfs.BucketWriter.append(BucketWriter.java:514)
    at org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:418)
    at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:68)
    at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:147)
    at java.lang.Thread.run(Thread.java:745)
I'm using CDH 5.3 with Kerberos.
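For reference, my agent config looks roughly like the sketch below. The agent name, topic, ZooKeeper host, HDFS path, and Kerberos principal/keytab are placeholders, not my real values; the property keys are the standard Flume Kafka source and HDFS sink ones.

tier1.sources  = kafka-source
tier1.channels = mem-channel
tier1.sinks    = hdfs-sink

# Kafka source reading the topic
tier1.sources.kafka-source.type = org.apache.flume.source.kafka.KafkaSource
tier1.sources.kafka-source.zookeeperConnect = zkhost:2181
tier1.sources.kafka-source.topic = my-topic
tier1.sources.kafka-source.channels = mem-channel

# Memory channel between source and sink
tier1.channels.mem-channel.type = memory
tier1.channels.mem-channel.capacity = 10000
tier1.channels.mem-channel.transactionCapacity = 1000

# HDFS sink (the exception above is thrown from BucketWriter while this sink appends)
tier1.sinks.hdfs-sink.type = hdfs
tier1.sinks.hdfs-sink.channel = mem-channel
tier1.sinks.hdfs-sink.hdfs.path = hdfs://namenode:8020/flume/kafka/%Y-%m-%d
tier1.sinks.hdfs-sink.hdfs.fileType = DataStream
tier1.sinks.hdfs-sink.hdfs.kerberosPrincipal = flume/_HOST@EXAMPLE.COM
tier1.sinks.hdfs-sink.hdfs.kerberosKeytab = /etc/flume-ng/conf/flume.keytab
tier1.sinks.hdfs-sink.hdfs.rollInterval = 300
tier1.sinks.hdfs-sink.hdfs.rollSize = 0
tier1.sinks.hdfs-sink.hdfs.rollCount = 0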
Any clue?
Dino