Member since: 11-18-2024
Posts: 5
Kudos Received: 2
Solutions: 0
12-20-2024
07:26 AM
@SAMSAL, this is indeed more efficient. Thanks for letting me know.
12-19-2024
10:14 PM
1 Kudo
Hi @SAMSAL, thanks, this did work.
12-18-2024
09:16 PM
Hi @SAMSAL, I'm using the CSV readers and writers in the Query Record processor and the CSVtoJSON Record Converter processor. I'm expecting around 10 types of CSV files (that is, files with different columns), and this number might grow to 50 in the future. Even though I know exactly what these 50 files will contain, I would still have to write all 50 schemas into a schema registry, which is complex to maintain and work with. So I tried wrapping every value in the CSV file in double quotes so that the reader and writer would treat them as strings, but they are still read as integers.

Input file:
col1,col2,col3,col4,col5
999,C10,100,010,0
999,C06,10,010,0

This is the avro.schema the CSV writer produces:
{"type":"record","name":"nifiRecord","namespace":"org.apache.nifi","fields":[{"name":"col1","type":["int","null"]},{"name":"col2","type":["int","null"]},{"name":"col3","type":["string","null"]},{"name":"col4","type":["int","null"]},{"name":"col5","type":["int","null"]}]}
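For contrast, what I actually want the writer to produce is the same record schema with every column treated as a string, so that values like 010 keep their leading zero. A rough sketch (assuming the columns stay col1-col5) would be:

{"type":"record","name":"nifiRecord","namespace":"org.apache.nifi","fields":[{"name":"col1","type":["string","null"]},{"name":"col2","type":["string","null"]},{"name":"col3","type":["string","null"]},{"name":"col4","type":["string","null"]},{"name":"col5","type":["string","null"]}]}

If schema inference can't be made to do this, one fallback would be to supply a schema like the above explicitly to the reader and writer (via a Schema Text style access strategy, if I understand those settings correctly), but that is exactly the per-file maintenance I'm trying to avoid.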
12-18-2024
01:53 AM
1 Kudo
I am using CSVReader and CSVRecordSetWriter with the Infer Schema setting. When I have values such as "030", the field is inferred as type 'int' in the written avro.schema, even though all of the values are enclosed in double quotes. I want it treated as a string: because of the int typing, the output after the processor shows 30 and the leading 0 is dropped. I want to keep using the Infer Schema property because I need to read the CSV files dynamically without hardcoding a schema.
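As a small illustration with a hypothetical two-column file (the column names here are made up), this is the behaviour I mean:

Input row:   "id","code"
             "1","030"
Output row:  1,30

The quotes around 030 don't stop the inferred schema from typing the column as int, so the leading zero is lost on write.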
Labels:
- Apache NiFi
11-18-2024
09:02 PM
2024-11-19 04:52:41,378 WARN [Monitor Processor Lifecycle Thread-1] o.a.n.controller.StandardProcessorNode Timed out while waiting for OnScheduled of ExecuteScript[id=acb441ba-c36b-1fdd-53f2-3a4821d43833] to finish. An attempt is made to cancel the task via Thread.interrupt(). However it does not guarantee that the task will be canceled since the code inside current OnScheduled operation may have been written to ignore interrupts which may result in a runaway thread. This could lead to more issues, eventually requiring NiFi to be restarted. This is usually a bug in the target Processor 'ExecuteScript[id=acb441ba-c36b-1fdd-53f2-3a4821d43833]' that needs to be documented, reported and eventually fixed.
Labels:
- Apache NiFi