Member since: 11-03-2023
Posts: 26
Kudos Received: 0
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1481 | 11-08-2023 09:29 AM
11-28-2023
07:28 PM
Thanks @SAMSAL, it's working, but I am not able to understand the settings. My requirement is: my maximum queue size will be 250 requests, and I want them to be retried every hour for up to 24 hours. Once a flow file has spent 24 hours in the queue I want it to expire, and I will drop it by connecting it to a log message processor. Can you please help with these settings?
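A rough sketch of how that could map onto standard NiFi settings (the exact placement is an assumption on my part, since the advice from @SAMSAL being referenced is not shown here):

On the self-loop (retry) connection:
    FlowFile Expiration            = 24 hours
    Back Pressure Object Threshold = 250

On the processor's Settings tab:
    Penalty Duration = 1 hour

With that combination a penalized flow file waits roughly an hour before its next attempt, the queue holds at most 250 flow files, and anything still queued after 24 hours is expired. Note that expired flow files are dropped by NiFi itself, so the "connect it to a log message processor" part needs a separate mechanism (see the question about logging expired flow files below).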
11-28-2023
05:45 AM
I tried adding kafka\.topic, but it did not work. I also checked the Kafka headers; nothing is being printed in the header there either.
11-28-2023
04:33 AM
Hi, I have a PublishKafka processor with a self-loop, and the flow file expiration time for that loop is 1 hour, so if a file sits there for more than 1 hour it gets removed. I want to print a log message (e.g. "dropping the flow file") every time a flow file is removed from the queue after expiring. This is the flow -
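NiFi does not route or emit anything you can act on when a connection expires a flow file, so here is a sketch of a workaround instead, using an attribute name (loop.entry.time) that is made up for illustration: stamp the time the flow file enters the loop with UpdateAttribute, then let RouteOnAttribute do the aging instead of the connection's FlowFile Expiration.

UpdateAttribute (before the self-loop):
    loop.entry.time = ${now():toNumber()}

RouteOnAttribute (inside the loop), dynamic property named "expired":
    ${now():toNumber():minus(${loop.entry.time}):gt(3600000)}

Flow files matching "expired" (older than 1 hour = 3,600,000 ms) are routed to a LogAttribute/LogMessage processor and can then be dropped; everything else loops back to PublishKafka.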
Labels:
- Apache NiFi
11-26-2023
08:39 AM
kafka.topic is a key; it will hold the value of a Kafka topic name, e.g. abc_xyz. This value keeps changing depending on which topic I am consuming the message from. My requirement is to send the topic I consumed from as a header to the topic where I am publishing. How can I do that?
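A sketch of one way this is usually done, assuming your PublishKafka version has the header-regex property (the property name below is from recent PublishKafka processors and may differ in older versions):

PublishKafka:
    Attributes to Send as Headers (Regex) = kafka\.topic

Any flow file attribute whose name matches that regex, here the kafka.topic attribute written by ConsumeKafka, is added to the outgoing message as a Kafka header of the same name, carrying the source topic value such as abc_xyz.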
11-24-2023
01:04 AM
I want to send the value of kafka.topic (i.e. the topic from which I consumed the message) as a header to the publish topic (another topic) using the PublishKafka processor, but I am getting the below error in the publish processor -
Labels:
- Apache NiFi
11-20-2023
12:10 AM
Hi, in Apache NiFi I am consuming data from one Kafka topic and publishing that data to the customer's Kafka topic. For this I am using the ConsumeKafka and PublishKafka processors respectively. If the outgoing dataflow is idle for more than 5 minutes, the Kafka connection with the customer should be terminated and then reconnected when new messages are consumed. How can I do this with the PublishKafka processor? How do I terminate and restart the connection?
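One approach worth trying is to handle this at the Kafka client level rather than in NiFi itself: the Kafka producer honours the connections.max.idle.ms client property, and the NiFi Kafka processors pass dynamic (user-added) properties through to the underlying client. A sketch, assuming your PublishKafka version supports dynamic client properties:

PublishKafka, dynamic property:
    connections.max.idle.ms = 300000

With that, the producer closes connections that have been idle for 5 minutes (300000 ms) and reconnects automatically when the next message is published, so the processor itself never has to be stopped and started.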
Labels:
- Apache Kafka
- Apache NiFi
11-16-2023
12:33 AM
Hi, in Apache NiFi I am consuming data from one Kafka topic and publishing that data to the customer's Kafka topic. For this I am using the ConsumeKafka and PublishKafka processors respectively. If the outgoing dataflow is idle for more than 5 minutes, the Kafka connection with the customer should be terminated and then reconnected when new messages are consumed. How can I do this with the PublishKafka processor?
Labels:
- Apache NiFi
11-15-2023
02:08 AM
I am trying to publish data into multiple Kafka topics based on some conditions, but instead of returning the value of the expression, the whole value passed for the property is coming through. customerTopic should return the value of the expression, but it is returning the expression itself. Attaching images of the flow.
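A guess at the cause, without seeing the exact property value: when NiFi returns the whole expression literally, the value is usually not valid Expression Language at all, most often because the braces and parentheses do not balance, or a nested ${...} reference is wrapped in quotes and so treated as a plain string. A minimal pattern that does evaluate, using made-up topic names for illustration:

    ${kafka.topic:contains('alerts'):ifElse('topicA', 'topicB')}

If the two result values must themselves come from attributes, reference them without surrounding quotes, e.g. ifElse(${kafkaTopic1}, ${kafkaTopic2}).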
Labels:
- Apache NiFi
11-14-2023
10:57 PM
"You would need a method outside of the consume/produce that handles logic for which consume topic maps to which produce topic." Can you please elaborate on that? What method and what logic need to be there?
11-14-2023
10:56 PM
I want to create a flow in NiFi where I consume data from 3 Kafka topics and produce that data into 3 different Kafka topics. Each consumed topic should go to a unique destination topic, for example:

kafka topic 1 --> produce to topic A
kafka topic 2 --> produce to topic B
kafka topic 3 --> produce to topic C

I want to use only one consumer processor and one producer processor. Right now I am using 3 PublishKafka processors. Can anyone suggest a better, more optimized approach? Help me reduce the three publish processors to 1 while still consuming from multiple topics and producing to multiple topics dynamically, i.e. topic 1 produces data to topic A only, topic 2 to topic B, and so on.

I tried using Expression Language. kafka.topic is the attribute which contains the consumed topic, so I added an if/else condition for the different consumed topics, but the output was not the producer topic, just the whole string mentioned below. In UpdateAttribute I added a property:

customerKafkaTopic=${kafka.topic:contains('alerts'):ifElse('${kafkaTopic1}',${kafka.topic:contains('events'):ifElse('${kafkaTopic2}',${kafka.topic:contains('ack'):ifElse('${kafkatopic3}','Not_found')}))}}

and passed this property customerKafkaTopic in PublishKafka, but it is not working. Please help with a working approach.
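Looking at the posted expression, the closing characters at the end are out of order: a ")" appears before the "}" that should close the inner ${kafka.topic:contains('events')...} block, so the whole value is not valid Expression Language and NiFi passes it through as a literal string. A corrected sketch, keeping the attribute names from the post and dropping the quotes around the nested attribute references (split across lines here for readability; it should be entered as a single line):

customerKafkaTopic =
    ${kafka.topic:contains('alerts'):ifElse(${kafkaTopic1},
      ${kafka.topic:contains('events'):ifElse(${kafkaTopic2},
        ${kafka.topic:contains('ack'):ifElse(${kafkatopic3}, 'Not_found')})})}

PublishKafka's Topic Name property can then be set to ${customerKafkaTopic}. The kafkaTopic1 / kafkaTopic2 / kafkatopic3 attributes are assumed to already hold the three destination topic names, as in the original attempt.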
Labels:
- Apache NiFi