I have a flow that loads a CSV file and transforms it to JSON. This gives me a JSON array where each entry is one data row.
Right now I split the flowfile so that I end up with one flowfile per data row, and then load these into Postgres via a PutDatabaseRecord processor. All of this works fine, but the splitting feels inefficient and my flows end up with millions of flowfiles flying around.
I tried giving the PutDatabaseRecord processor the JSON array directly, but that just threw an error.
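For reference, the array I'm feeding in looks roughly like this (the column names here are just examples, not my real schema):

```json
[
  { "id": 1, "name": "alice", "amount": 10.50 },
  { "id": 2, "name": "bob",   "amount": 7.25 }
]
```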
Is there a way to keep multiple data rows in a single flowfile and load them into the database in one go?