My requirement is to load a CSV file into a database using NiFi. My current (inefficient) flow is SplitText --> ExtractText --> UpdateAttribute --> ReplaceText --> PutSQL. I am looking for a solution that lets me do one batch commit instead of an individual INSERT for each FlowFile.
I have seen a couple of suggestions online, such as using the PutDatabaseRecord or ConvertJSONToSQL processors.
However, I have to invoke UDFs in the INSERT statement to transform the data before it is written to the table, and I do not think either of those processors will work when UDF calls are involved.
Is there a way to wrap the INSERT statements in a BEGIN/COMMIT block before passing them to PutSQL? Or can I invoke some SQL utility to achieve that? I would really appreciate your help.
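To make the intent concrete, here is a minimal standalone sketch (not a NiFi flow) of the behaviour I am after: many per-row INSERTs that each call a UDF, executed under a single BEGIN/COMMIT rather than one commit per statement. The table and UDF names (`staging`, `clean_name`) are made up for illustration, and sqlite3 stands in for the real database, where the UDF would already exist:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (name TEXT)")

# Register a stand-in UDF; on the real database this would already be defined.
conn.create_function("clean_name", 1, lambda s: s.strip().upper())

# One INSERT per CSV row, each invoking the UDF on the raw value.
rows = ["  alice ", "bob  ", " carol"]
statements = [
    f"INSERT INTO staging (name) VALUES (clean_name('{r}'))" for r in rows
]

# Join the statements into a single transactional batch: this is the
# BEGIN ... many INSERTs ... COMMIT shape I want to hand to PutSQL.
batch = "BEGIN;\n" + ";\n".join(statements) + ";\nCOMMIT;"
conn.executescript(batch)

print(conn.execute("SELECT name FROM staging ORDER BY name").fetchall())
```

In NiFi terms, the question is whether something (a processor, a merge step, or an external SQL utility) can assemble the individual INSERT FlowFiles into one batch of this shape before PutSQL executes it.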