<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Flume TAILDIR Source to Kafka Sink - Static Interceptor Issue in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Flume-TAILDIR-Source-to-Kafka-Sink-Static-Interceptor-Issue/m-p/86388#M45252</link>
    <description>Flume TAILDIR Source to Kafka Sink - Static Interceptor Issue: a Flume TAILDIR source prepends a static host-name/IP header to log lines that a Kafka producer sink writes to a topic, but the header sometimes arrives at the consumer as a separate message.</description>
    <pubDate>Fri, 16 Sep 2022 14:09:12 GMT</pubDate>
    <dc:creator>Matrix</dc:creator>
    <dc:date>2022-09-16T14:09:12Z</dc:date>
    <item>
      <title>Flume TAILDIR Source to Kafka Sink- Static Interceptor Issue</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Flume-TAILDIR-Source-to-Kafka-Sink-Static-Interceptor-Issue/m-p/86388#M45252</link>
      <description>&lt;P&gt;Hello Everyone,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The scenario I'm trying to set up is as follows:&lt;/P&gt;&lt;P&gt;1- A Flume&amp;nbsp;TAILDIR source reads from a log file and appends a static interceptor to the beginning of each message. The interceptor consists of the host name and the host IP, because they are required with every log message I receive.&lt;/P&gt;&lt;P&gt;2-&amp;nbsp;A Flume Kafka producer sink takes those messages and puts them in a Kafka topic.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The&amp;nbsp;Flume configuration is as follows:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;tier1.sources=source1
tier1.channels=channel1
tier1.sinks=sink1

tier1.sources.source1.interceptors=i1
tier1.sources.source1.interceptors.i1.type=static
tier1.sources.source1.interceptors.i1.key=HostData
tier1.sources.source1.interceptors.i1.value=###HostName###000.00.0.000###

tier1.sources.source1.type=TAILDIR
tier1.sources.source1.positionFile=/usr/software/flumData/flumeStressAndKafkaFailureTestPos.json
tier1.sources.source1.filegroups=f1
tier1.sources.source1.filegroups.f1=/usr/software/flumData/flumeStressAndKafkaFailureTest.txt
tier1.sources.source1.channels=channel1

tier1.channels.channel1.type=file
tier1.channels.channel1.checkpointDir=/usr/software/flumData/checkpoint
tier1.channels.channel1.dataDirs=/usr/software/flumData/data

tier1.sinks.sink1.channel=channel1
tier1.sinks.sink1.type=org.apache.flume.sink.kafka.KafkaSink
tier1.sinks.sink1.kafka.bootstrap.servers=&amp;lt;Removed For Confidentiality&amp;gt;
tier1.sinks.sink1.kafka.topic=FlumeTokafkaTest
tier1.sinks.sink1.kafka.flumeBatchSize=20
tier1.sinks.sink1.kafka.producer.acks=0
tier1.sinks.sink1.useFlumeEventFormat=true
tier1.sinks.sink1.kafka.producer.linger.ms=1
tier1.sinks.sink1.kafka.producer.client.id=HOSTNAME
tier1.sinks.sink1.kafka.producer.compression.type=snappy&lt;/PRE&gt;&lt;P&gt;Now I'm testing: I ran a console Kafka consumer, started writing to the source file, and I do receive the message with the header prepended.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Example:&lt;/P&gt;&lt;P&gt;I write 'test' in the source file, press Enter, then save the file.&lt;/P&gt;&lt;P&gt;Flume detects the file change and sends the new line to the Kafka producer.&lt;/P&gt;&lt;P&gt;My consumer gets the following line:&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;###HostName###000.00.0.000###test&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;The&amp;nbsp;issue is that sometimes&lt;/STRONG&gt; the interceptor doesn't work as expected. It's as if Flume sends two messages: one contains the interceptor and the other the message content.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Example:&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;I write 'hi you' in the source file, press Enter, then save the file.&lt;/P&gt;&lt;P&gt;Flume detects the file change and sends the new line to the Kafka producer.&lt;/P&gt;&lt;P&gt;My consumer gets the following 2 lines:&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;###HostName###000.00.0.000###&lt;BR /&gt;hi you&lt;/PRE&gt;&lt;P&gt;And the terminal scrolls to the new message&amp;nbsp;content.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;This always happens when I type 'hi you' in the text file, and since I read from a log file, it's not predictable when it will happen.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Help and support will be much appreciated ^^&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thank you&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 16 Sep 2022 14:09:12 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Flume-TAILDIR-Source-to-Kafka-Sink-Static-Interceptor-Issue/m-p/86388#M45252</guid>
      <dc:creator>Matrix</dc:creator>
      <dc:date>2022-09-16T14:09:12Z</dc:date>
    </item>
    <item>
      <title>Re: Flume TAILDIR Source to Kafka Sink- Static Interceptor Issue</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Flume-TAILDIR-Source-to-Kafka-Sink-Static-Interceptor-Issue/m-p/86449#M45253</link>
      <description>Are you sure that Kafka receives two messages? Or is it just how the consumer displays the messages on the terminal? Kafka is key-value based, so search the Flume logs to see whether one or two messages were submitted to the topic.</description>
      <pubDate>Thu, 14 Feb 2019 07:29:28 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Flume-TAILDIR-Source-to-Kafka-Sink-Static-Interceptor-Issue/m-p/86449#M45253</guid>
      <dc:creator>Tomas79</dc:creator>
      <dc:date>2019-02-14T07:29:28Z</dc:date>
    </item>
    <item>
      <title>Re: Flume TAILDIR Source to Kafka Sink- Static Interceptor Issue</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Flume-TAILDIR-Source-to-Kafka-Sink-Static-Interceptor-Issue/m-p/86455#M45254</link>
      <description>Well, in the production case I'm using a Streamsets pipeline with a Kafka Consumer element as the source of the stream.&lt;BR /&gt;&lt;BR /&gt;And I receive 2 messages: one is the interceptor and the other is the message content without the interceptor. I will test it again now to make sure of that.&lt;BR /&gt;&lt;BR /&gt;Sometimes I also receive the full message with the interceptor appended to it. But I notice that there are garbage characters between the interceptor and the message, which I have to remove with a regex expression.&lt;BR /&gt;&lt;BR /&gt;About the Flume side: no, I'm not sure whether Flume actually sends 2 messages. Can you tell me how I can check that? Is there some parameter in the logging configuration that would show such a thing?</description>
      <pubDate>Thu, 14 Feb 2019 08:57:09 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Flume-TAILDIR-Source-to-Kafka-Sink-Static-Interceptor-Issue/m-p/86455#M45254</guid>
      <dc:creator>Matrix</dc:creator>
      <dc:date>2019-02-14T08:57:09Z</dc:date>
    </item>
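A minimal sketch of the symptom discussed above, independent of the poster's actual topic and brokers: a line-delimited text consumer splits a single record at every embedded newline, so one Kafka record whose payload contains a stray \n between the header and the body is displayed as two records. The header string and payloads below are just the examples from this thread; the helper name is hypothetical.

```python
# Simulate how a line-delimited text consumer renders a raw Kafka payload.
HEADER = "###HostName###000.00.0.000###"

def records_seen_by_text_consumer(payload: str, delimiter: str = "\n") -> list[str]:
    """Split one raw payload the way a text-mode consumer with a line
    delimiter would, dropping empty trailing pieces."""
    return [part for part in payload.split(delimiter) if part]

# A clean payload is displayed as a single record:
print(records_seen_by_text_consumer(HEADER + "test"))
# A payload with an embedded "\n" is displayed as two records,
# matching the two-line output the poster saw for 'hi you':
print(records_seen_by_text_consumer(HEADER + "\n" + "hi you"))
```

This is only a model of the display side; it does not tell you whether Flume produced one record or two, which is what checking the broker offsets or the Flume logs would answer.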
    <item>
      <title>Re: Flume TAILDIR Source to Kafka Sink- Static Interceptor Issue</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Flume-TAILDIR-Source-to-Kafka-Sink-Static-Interceptor-Issue/m-p/86459#M45255</link>
      <description>&lt;P&gt;So I just tested it with Streamsets.&lt;/P&gt;&lt;P&gt;I sent the message 'hmmmm', for which Flume separates the interceptor from the message itself, and I did receive 2 records:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hmmmm.JPG" style="width: 600px;"&gt;&lt;img src="https://community.cloudera.com/t5/image/serverpage/image-id/5317i08044C5BEB5975F3/image-size/large?v=v2&amp;amp;px=999" role="button" title="hmmmm.JPG" alt="hmmmm.JPG" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Then I sent another message, 'hi', and&amp;nbsp;received it as 1 record:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hi.JPG" style="width: 600px;"&gt;&lt;img src="https://community.cloudera.com/t5/image/serverpage/image-id/5318i7C22C17EDFEADC0F/image-size/large?v=v2&amp;amp;px=999" role="button" title="hi.JPG" alt="hi.JPG" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;So there is definitely a problem that I&amp;nbsp;haven't figured out yet.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The Flume logs for both cases are shown below. I don't see anything weird there; what do you think?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Logs for hmmmm&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;14 Feb 2019 12:20:57,991 INFO  [PollableSourceRunner-TaildirSource-source1] (org.apache.flume.source.taildir.ReliableTaildirEventReader.openFile:283)  - Opening file: /usr/software/flumData/flumeStressAndKafkaFailureTest.txt, inode: 1275070426, pos: 21
14 Feb 2019 12:21:19,593 INFO  [Log-BackgroundWorker-channel1] (org.apache.flume.channel.file.EventQueueBackingStoreFile.beginCheckpoint:227)  - Start checkpoint for /usr/software/flumData/checkpoint/checkpoint, elements to sync = 1
14 Feb 2019 12:21:19,596 INFO  [Log-BackgroundWorker-channel1] (org.apache.flume.channel.file.EventQueueBackingStoreFile.checkpoint:252)  - Updating checkpoint metadata: logWriteOrderID: 1550135209575, queueSize: 0, queueHead: 66366
14 Feb 2019 12:21:19,599 INFO  [Log-BackgroundWorker-channel1] (org.apache.flume.channel.file.Log.writeCheckpoint:1052)  - Updated checkpoint for file: /usr/software/flumData/data/log-19 position: 1497 logWriteOrderID: 1550135209575&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Logs for hi&lt;/STRONG&gt;&lt;/P&gt;&lt;PRE&gt;14 Feb 2019 12:22:49,600 INFO  [Log-BackgroundWorker-channel1] (org.apache.flume.channel.file.EventQueueBackingStoreFile.beginCheckpoint:227)  - Start checkpoint for /usr/software/flumData/checkpoint/checkpoint, elements to sync = 1
14 Feb 2019 12:22:49,607 INFO  [Log-BackgroundWorker-channel1] (org.apache.flume.channel.file.EventQueueBackingStoreFile.checkpoint:252)  - Updating checkpoint metadata: logWriteOrderID: 1550135209580, queueSize: 0, queueHead: 66366
14 Feb 2019 12:22:49,613 INFO  [Log-BackgroundWorker-channel1] (org.apache.flume.channel.file.Log.writeCheckpoint:1052)  - Updated checkpoint for file: /usr/software/flumData/data/log-19 position: 1701 logWriteOrderID: 1550135209580&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 14 Feb 2019 09:27:52 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Flume-TAILDIR-Source-to-Kafka-Sink-Static-Interceptor-Issue/m-p/86459#M45255</guid>
      <dc:creator>Matrix</dc:creator>
      <dc:date>2019-02-14T09:27:52Z</dc:date>
    </item>
    <item>
      <title>Re: Flume TAILDIR Source to Kafka Sink- Static Interceptor Issue</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Flume-TAILDIR-Source-to-Kafka-Sink-Static-Interceptor-Issue/m-p/86461#M45256</link>
      <description>&lt;P&gt;Here is a screenshot of the 2 messages written to a UTF-8 file.&lt;/P&gt;&lt;P&gt;I used Notepad++ to show all symbols:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;You can see the garbage characters between the interceptor and the message.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="RealOutput.JPG" style="width: 541px;"&gt;&lt;img src="https://community.cloudera.com/t5/image/serverpage/image-id/5319iA9D4D1F5016A3BE9/image-size/large?v=v2&amp;amp;px=999" role="button" title="RealOutput.JPG" alt="RealOutput.JPG" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Another thing I noticed: when I configured the Kafka consumer data format as Text, I receive 2 messages (interceptor + message content), but when&amp;nbsp;I configure it as Binary, I receive only 1 message.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;So maybe the problem is with the Kafka consumer data format?&lt;/P&gt;</description>
      <pubDate>Thu, 14 Feb 2019 09:47:57 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Flume-TAILDIR-Source-to-Kafka-Sink-Static-Interceptor-Issue/m-p/86461#M45256</guid>
      <dc:creator>Matrix</dc:creator>
      <dc:date>2019-02-14T09:47:57Z</dc:date>
    </item>
    <item>
      <title>Re: Flume TAILDIR Source to Kafka Sink- Static Interceptor Issue</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Flume-TAILDIR-Source-to-Kafka-Sink-Static-Interceptor-Issue/m-p/86466#M45257</link>
      <description>&lt;P&gt;So, I did the following in the Kafka Consumer element of Streamsets:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="KafkaConsumerConfig.JPG" style="width: 600px;"&gt;&lt;img src="https://community.cloudera.com/t5/image/serverpage/image-id/5320i2ED13FC28E4F2AF7/image-size/large?v=v2&amp;amp;px=999" role="button" title="KafkaConsumerConfig.JPG" alt="KafkaConsumerConfig.JPG" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;And now it works fine!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I will keep testing it and come back here in a couple of days to mark the post as the solution.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I still notice that the file I write into has more lines than the number of records that come in. It still happens even though I checked &lt;STRONG&gt;Ignore Control Characters&lt;/STRONG&gt; in the Kafka element.&lt;/P&gt;</description>
      <pubDate>Thu, 14 Feb 2019 10:34:55 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Flume-TAILDIR-Source-to-Kafka-Sink-Static-Interceptor-Issue/m-p/86466#M45257</guid>
      <dc:creator>Matrix</dc:creator>
      <dc:date>2019-02-14T10:34:55Z</dc:date>
    </item>
    <item>
      <title>Re: Flume TAILDIR Source to Kafka Sink- Static Interceptor Issue</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Flume-TAILDIR-Source-to-Kafka-Sink-Static-Interceptor-Issue/m-p/86674#M45258</link>
      <description>&lt;P&gt;Back.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;So far it's working fine.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I also found the problem with the writing to the file.&lt;/P&gt;&lt;P&gt;The garbage data between the interceptor and the message can contain literally anything. It contained \n, which is the line feed on Linux systems. This was causing the Kafka problem as well: the consumer sees the \n and assumes the message is 2 messages, not&amp;nbsp;1. That's why, when I changed the delimiter to \r\n, it treated the message as 1 message. That's a good conclusion, I guess.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;If you want to write the message to a file or apply a regex to it, just replace \n and \r with an empty string so you don't have to deal with those annoying control characters.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks to whoever wanted to help me.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 19 Feb 2019 13:17:01 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Flume-TAILDIR-Source-to-Kafka-Sink-Static-Interceptor-Issue/m-p/86674#M45258</guid>
      <dc:creator>Matrix</dc:creator>
      <dc:date>2019-02-19T13:17:01Z</dc:date>
    </item>
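The workaround in the accepted answer above, replacing \r and \n with empty strings before writing a message to a file or regex-matching it, can be sketched in a few lines. This is a minimal illustration with a hypothetical helper name, not part of the poster's actual pipeline:

```python
# Strip CR and LF from a message body so stray control characters
# cannot split the record into multiple lines downstream.
def strip_line_breaks(message: str) -> str:
    """Return the message with every "\r" and "\n" removed."""
    return message.replace("\r", "").replace("\n", "")

# Example payload modeled on the thread: header, stray CRLF, then body.
raw = "###HostName###000.00.0.000###\r\nhi you"
print(strip_line_breaks(raw))  # ###HostName###000.00.0.000###hi you
```

In a Streamsets pipeline the same replacement could be done with an expression or field-replacer stage; the point is simply that the sanitized string contains no line delimiters for a text-mode consumer or writer to split on.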
  </channel>
</rss>

