Consuming data from multiple Kafka topics and publishing to multiple topics - one-to-one mapping
Labels: Apache NiFi
Created 11-14-2023 10:56 PM
I want to create a flow in NiFi where I consume data from 3 Kafka topics and produce that data into 3 different Kafka topics. Each source topic should feed a unique target topic, for example:
kafka topic 1 --> produce to topic A
kafka topic 2 --> produce to topic B
kafka topic 3 --> produce to topic C
I want to use only one consume processor and one publish processor. Right now I am using 3 PublishKafka processors.
Can anyone suggest a better, more optimized approach? Help me reduce the three publish processors to one; the flow should still consume from multiple topics and publish to multiple topics dynamically, so that topic 1 produces data to topic A only, topic 2 to topic B, and so on.
I tried using Expression Language. kafka.topic is the attribute that contains the consumed topic name, so I added if/else conditions for the different consumer topics, but the output was not the producer topic; it was the whole string shown below.
In UpdateAttribute I added the property:
customerKafkaTopic=${kafka.topic:contains('alerts'):ifElse('${kafkaTopic1}',${kafka.topic:contains('events'):ifElse('${kafkaTopic2}',${kafka.topic:contains('ack'):ifElse('${kafkatopic3}','Not_found')}))}}
I passed this customerKafkaTopic property to PublishKafka, but it is not working. Please help with a working approach.
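
The expression above fails because its parentheses and braces close in the wrong order (the tail reads ")}))}}"), so NiFi cannot parse it as Expression Language and returns the whole value as a literal string. A minimal sketch of a balanced version, assuming the consumed topic names contain 'alerts', 'events', and 'ack', with topicA/topicB/topicC as placeholder target names (substitute your own literal names or attribute references):

customerKafkaTopic = ${kafka.topic:contains('alerts'):ifElse('topicA', ${kafka.topic:contains('events'):ifElse('topicB', ${kafka.topic:contains('ack'):ifElse('topicC', 'Not_found')})})}

Enter the value as a single line in the UpdateAttribute property, then set the Topic Name of a single PublishKafka processor to ${customerKafkaTopic} so each FlowFile routes to the target topic derived from its source.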
Created 11-21-2023 01:56 PM
What's wrong with the 1:1 relationship? If you're concerned about performance, then leveraging "Message Demarcator" on both the Consume and Publish processors will provide the best throughput.
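
A rough sketch of that setup, assuming ConsumeKafka_2_6/PublishKafka_2_6 (property names may differ slightly across processor versions) and the placeholder topic names from the question:

ConsumeKafka_2_6
  Topic Name(s):      alerts,events,ack
  Message Demarcator: newline (batches many messages into each FlowFile)

PublishKafka_2_6
  Topic Name:         ${customerKafkaTopic}
  Message Demarcator: newline (must match the demarcator used when consuming)

With a demarcator set, each FlowFile carries a batch of messages from a single source topic (ConsumeKafka does not mix topics within one FlowFile), so one publisher can still route every batch to its mapped target topic via the Topic Name expression.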