Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.

NiFi: ExecuteSQL returns multiple rows of data, but only one entry is written to the MongoDB database

New Member

I have a query that I execute using the ExecuteSQL processor, and it returns 900 rows of data. Its output is in Avro format, which I convert to JSON (one JSON document per row of data) using the ConvertAvroToJSON processor. The output is then sent to a PutMongo processor. But only the first row from this processor is written to MongoDB; the others are not written, and I get no errors. Any suggestion on how to make sure all rows are written into MongoDB?

I have attached a diagram of the NiFi flow.

Thanks!

VJ

1 ACCEPTED SOLUTION

Super Collaborator

Are you sure that only the first record was written? The NiFi documentation says ConvertAvroToJSON converts to a single JSON object.



New Member

Yes, the query returns over 1300 rows. The ConvertAvroToJSON output goes both to MongoDB and to a file (using PutFile). The file gets all the rows, one JSON document per line. But MongoDB gets only one record, which is the first one.
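This symptom is consistent with the FlowFile holding many JSON documents while the insert path parses it as a single document. A minimal Python sketch of that behavior, using the standard-library `json` module (the ids and names in the payload are invented for illustration, and this is only an analogy for the processor, not its actual code):

```python
import json

# Simulated FlowFile content: one JSON document per line, like the
# PutFile output described above (record contents are made up).
payload = '{"id": 1, "name": "a"}\n{"id": 2, "name": "b"}\n{"id": 3, "name": "c"}'

# A parser that expects a single JSON document stops after the first
# object and ignores the rest of the payload -- analogous to only the
# first record reaching the collection.
doc, end = json.JSONDecoder().raw_decode(payload)
print(doc)                     # {'id': 1, 'name': 'a'}
print(end < len(payload))      # True: the remaining records were never read
```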

Super Collaborator

Try inserting a SplitAvro processor before your ConvertAvroToJSON, so each FlowFile carries a single record. So try a flow like this:

ExecuteSQL > SplitAvro > ConvertAvroToJSON > PutMongo
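The effect of the split can be sketched in plain Python, assuming each post-split FlowFile holds exactly one JSON document. `FakeCollection` is a hypothetical stand-in for a MongoDB collection (mimicking pymongo's `insert_one` without a server), and the record contents are invented:

```python
import json

# Hypothetical stand-in for a MongoDB collection; insert_one mimics
# pymongo's Collection.insert_one without needing a server.
class FakeCollection:
    def __init__(self):
        self.docs = []

    def insert_one(self, doc):
        self.docs.append(doc)

# One record per FlowFile, as SplitAvro (followed by conversion to
# JSON) would produce -- contents are made up for illustration.
flowfiles = ['{"id": 1}', '{"id": 2}', '{"id": 3}']

collection = FakeCollection()
for content in flowfiles:       # each FlowFile holds exactly one document
    collection.insert_one(json.loads(content))

print(len(collection.docs))     # 3 -- every record lands in the collection
```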

New Member

That worked great! Thank you very much! My collection now has over 1300 records as expected.

Super Collaborator

Sweet. Glad I could help.