Member since: 10-15-2018
Posts: 6
Kudos Received: 0
Solutions: 0
11-01-2018
01:30 PM
@Shu, can you please help me resolve it? I greatly appreciate your time. Thank you.
10-22-2018
02:38 PM
@Shu, Thank you for taking the time to reply. I am using the PublishKafkaRecord processor with CSVReader and AvroRecordSetWriter services for two reasons: 1) I have to convert my CSV records to Avro, and 2) I need to check whether the schema of each incoming message matches my Avro schema. I don't want to send all incoming messages to Kafka, only the ones that are successfully converted to my schema. Please let me know if I can provide any other info. Thank you.
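For illustration, the Avro schema my AvroRecordSetWriter checks against is shaped roughly like this (the field names below are placeholders for this post, not my real columns):

{
  "type": "record",
  "name": "csv_record",
  "namespace": "example",
  "fields": [
    { "name": "id", "type": "string" },
    { "name": "name", "type": "string" },
    { "name": "value", "type": ["null", "double"], "default": null }
  ]
}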
10-18-2018
08:01 PM
@Shu, Thank you so much for responding. Attaching my flow and MergeContent configuration. By the way, MergeContent creates more than one file and adds a header on each record. I am not sure what the best settings for merging are. flow1.png flow2.png flow3.png Thank you.
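To be concrete about what I am trying, these are the MergeContent properties I have been experimenting with (property names are from the MergeContent documentation; the sizes and the header line are placeholders I have not confirmed, not values from my screenshots):

Merge Strategy            : Bin-Packing Algorithm
Merge Format              : Binary Concatenation
Minimum Number of Entries : 1000000   (large enough to hold one whole file)
Maximum Number of Entries : 1000000
Max Bin Age               : 5 min     (so the bin still closes when fewer records arrive)
Maximum number of Bins    : 5
Delimiter Strategy        : Text
Header                    : col1,col2,col3   (placeholder for my real CSV header)
Demarcator                : (a newline)

My understanding is that with Delimiter Strategy set to Text, the individual record FlowFiles must not carry their own header line, otherwise the header gets repeated in the merged output.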
10-17-2018
02:22 PM
@Shu, I tried your suggestion, but my output file has a header on each record. Can you please guide me on how to have a single header for the merged file? I appreciate your time. Thank you.
10-16-2018
01:41 PM
@Shu, Thank you for replying. I will try your suggestion and let you know. My intention in splitting the records was to capture the individual records that failed to publish to Kafka. Right now, if any record fails, the entire file fails, and I don't know how to capture only the failed ones and continue sending the other records. Can you please suggest a better approach? Thank you.
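To make the shape of the flow clearer, it currently looks roughly like this (I split and replace text before publishing; SplitText, ReplaceText and PutFile below are only indicative names for those steps, not necessarily the exact processors):

SplitText (one record per FlowFile)
  -> ReplaceText
  -> PublishKafkaRecord
       success -> MergeContent (put the records back into one CSV)
       failure -> PutFile (capture the individual failed record)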
10-15-2018
06:59 PM
Hello, I am new to NiFi and need some help with merging flow files. Attaching my flow (flow1.png), my MergeContent settings (mergeprocesorsetting.png), and the error (error.png).

I am trying to publish CSV records to Kafka using the PublishKafkaRecord processor. The file may contain anywhere from a thousand to a million records. Before sending to Kafka, I am splitting my file and replacing some text, so I am sending individual records to Kafka. After successfully publishing them, I want to put all the individual records back into a single CSV file with one header.

I tried the MergeContent processor with the Defragment strategy. It works fine for 10,000 records, but when my file is bigger it gives a "cannot defragment" error. I tried to set Maximum Number of Entries to no value, but it takes 1000 as the default, and I am not sure how to leave it blank. I also tried the Bin-Packing strategy, but it creates many files and repeats the header for each record. Can anyone guide me on how to merge the flow files into a single file? Please help me fix this error.
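For reference, the Defragment settings I have been trying look roughly like this (the numbers are placeholders, not my exact values; my understanding from the documentation is that Maximum Number of Entries has to be at least as large as the fragment.count of the split, which would explain why the default of 1000 only works for small files):

Merge Strategy            : Defragment
Merge Format              : Binary Concatenation
Maximum Number of Entries : 1000000   (needs to cover the largest fragment.count)
Maximum Group Size        : (no value)
Max Bin Age               : 10 min
Maximum number of Bins    : 5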
Labels:
- Apache NiFi