Created 09-29-2016 11:13 PM
testingsqlflow.jpg
I have a query that I execute using the ExecuteSQL processor; it returns 900 rows of data. Its output is in Avro format, which I convert to JSON (one JSON document per row of data) using the ConvertAvroToJSON processor. The output is then sent to a PutMongo processor, but only the first row is written to MongoDB. The other rows are not written, and I get no errors. Any suggestions on how to make sure all rows are written to MongoDB?
I have attached a diagram of the Nifi flow.
Thanks!
VJ
Created 09-30-2016 02:03 PM
Are you sure that only the first record was written? The NiFi docs say ConvertAvroToJSON converts the content to a single JSON object.
Created 09-30-2016 02:34 PM
Yes, the query returns over 1300 rows. The ConvertAvroToJSON output goes both to MongoDB (via PutMongo) and to a file (via PutFile). The file gets all the rows, one JSON document per line, but MongoDB gets only one record, the first one.
Created 09-30-2016 04:09 PM
Try adding a SplitAvro processor before your ConvertAvroToJSON, so each Avro record becomes its own FlowFile. So try a flow like this:
ExecuteSQL > SplitAvro > ConvertAvroToJSON > PutMongo
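The reason the split helps can be sketched outside NiFi. This is a minimal Python illustration with made-up row data (the field names are assumptions, not from your flow): when all rows travel as one payload, the sink sees one document, but splitting first yields one payload per row, so each row gets its own insert.

```python
import json

# Hypothetical rows standing in for the Avro records ExecuteSQL returns.
rows = [
    {"id": 1, "name": "alpha"},
    {"id": 2, "name": "beta"},
    {"id": 3, "name": "gamma"},
]

# Without a split step: every row ends up in one JSON payload, so a
# single insert into the collection stores only one document.
single_payload = json.dumps(rows)

# With a split step first: one payload per row, so each row becomes
# its own document in the collection.
split_payloads = [json.dumps(row) for row in rows]

print(len(split_payloads))  # one payload (and thus one insert) per row
```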
Created 09-30-2016 06:08 PM
That worked great! Thank you very much! My collection now has over 1300 records as expected.
Created 09-30-2016 06:40 PM
Sweet. Glad I could help.