I have CSV files and I want to move their content, along with metadata (file name, source (to be hard-coded), and control number (part of the file name, to be extracted from it)), through NiFi. Here are the sample file name and layout -
File name - 12345_user_data.csv (control_number_user_data.csv)
source - Newyork
CSV file content/columns -
Fields - abc1, abc2, abc3, abc4
values - 1,2,3,4
Postgres database table layout
Table name - User_Education
Field names -
control_number, file_name, source, abc1, abc2, abc3, abc4
Sample row - 12345, 12345_user_data.csv, Newyork, 1, 2, 3, 4
I am planning to use the processors below -
But I am not sure how to combine the actual content with the metadata to load into one single table. Please help.
There are many ways to do this. I have added a template to my NiFi templates for you. This one takes a CSV input, splits the lines, extracts two columns, builds an INSERT statement, and executes that statement (this requires a Database Connection Pool controller service).
The only really tricky part is the regex for mapping the columns in the ExtractText processor.
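To illustrate what that regex has to do, here is a minimal Python sketch (the same pattern works in ExtractText, where each capture group becomes a FlowFile attribute such as abc1.1). The column names abc1..abc4 come from the sample layout above; the exact pattern is an assumption for a simple four-column, comma-separated line with no quoted fields.

```python
import re

# After SplitText, each FlowFile holds one CSV line like "1,2,3,4".
# One capture group per column pulls the values out; in ExtractText,
# a dynamic property named "abc1" with this pattern would yield the
# attribute "abc1.1" holding group 1, and so on.
line = "1,2,3,4"
pattern = r"^([^,]*),([^,]*),([^,]*),([^,]*)$"
match = re.match(pattern, line)
abc1, abc2, abc3, abc4 = match.groups()
print(abc1, abc2, abc3, abc4)  # -> 1 2 3 4
```

Note this simple pattern breaks on quoted fields containing commas; for messy CSV you would want a record-oriented processor instead of line regexes.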
Once you are able to parse the CSV into attributes, adding more attributes for the metadata and adding those details to the INSERT query should be very easy.
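Here is a hedged Python sketch of that combining step, using the sample values from the question. In an actual flow you would derive control_number with UpdateAttribute (Expression Language) from the standard filename attribute and assemble the statement with ReplaceText, but the logic is the same; the regex and table/column names below are taken from the question.

```python
import re

filename = "12345_user_data.csv"   # NiFi's "filename" attribute
source = "Newyork"                 # hard-coded, per the question
# Control number = leading digits of the file name.
control_number = re.match(r"^(\d+)_", filename).group(1)

abc = ["1", "2", "3", "4"]         # column values parsed by ExtractText

insert = (
    "INSERT INTO User_Education "
    "(control_number, file_name, source, abc1, abc2, abc3, abc4) "
    "VALUES ('{}', '{}', '{}', {}, {}, {}, {})".format(
        control_number, filename, source, *abc
    )
)
print(insert)
```

For a real flow, prefer PutSQL's parameterized form (sql.args.N.value / sql.args.N.type attributes) over string-building, which avoids quoting and injection problems.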
Hope this helps get you started. Additionally, if you search here, you will find plenty of posts covering the other suggested methods for processing CSV to SQL.
Looking over your most recent post, it appears that @stevenmatison solved your issue. Once you've had a chance to try out the files he provided, can you confirm by using the Accept as Solution button, found at the bottom of his reply, so it can be of assistance to others?