NiFi: ExecuteSQL returns multiple rows of data, but only one entry is written to the MongoDB database

I have a query that I execute using the ExecuteSQL processor; it returns 900 rows of data. Its output is in Avro format, which I convert to JSON (one JSON document per row of data) using the ConvertAvroToJSON processor. The output is then sent to a PutMongo processor. But only the first row is written to MongoDB. The others are not written, and I get no errors. Any suggestion on how to make sure all rows are written to MongoDB?

I have attached a diagram of the Nifi flow.

Thanks!

VJ

5 REPLIES

Expert Contributor

Are you sure that only the first record was written? The NiFi docs say ConvertAvroToJSON converts the content to a single JSON object.

Yes, the query returns over 1300 rows. The ConvertAvroToJSON output goes both to MongoDB and to a file (using PutFile). The file gets all the rows, one JSON document per line. But MongoDB gets only one record, which is the first one.
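That symptom is consistent with all the rows landing in a single FlowFile. If PutMongo treats the FlowFile content as one document, a single-document JSON parse of a newline-delimited payload can pick up the first object and stop without raising an error. A rough illustration in plain Python (not NiFi code; the payload below is a made-up example):

import json

# Made-up example of a FlowFile holding several newline-delimited JSON records.
flowfile_content = (
    '{"id": 1, "name": "alpha"}\n'
    '{"id": 2, "name": "beta"}\n'
    '{"id": 3, "name": "gamma"}\n'
)

# A parser expecting exactly one document returns the first object
# and stops -- the remaining lines are never looked at, and no error is raised.
first_doc, end = json.JSONDecoder().raw_decode(flowfile_content)
print(first_doc)               # {'id': 1, 'name': 'alpha'}
print(flowfile_content[end:])  # the records that never get inserted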

Expert Contributor

Try adding a SplitAvro processor before your ConvertAvroToJSON. So try a flow like this:

ExecuteSQL > SplitAvro > ConvertAvroToJSON > PutMongo
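With SplitAvro in the middle, each output FlowFile should carry a single record (its default split size is one record, if I remember correctly), so ConvertAvroToJSON emits one JSON document per FlowFile and PutMongo inserts them individually. A quick way to confirm the inserts on the MongoDB side is a count from a small client script; a sketch with pymongo, where the connection string, database, and collection names are placeholders rather than values from this flow:

import pymongo

# Placeholder connection string, database, and collection names --
# substitute the values your PutMongo processor is configured with.
client = pymongo.MongoClient("mongodb://localhost:27017")
collection = client["mydb"]["mycollection"]

# The count should match the number of rows returned by ExecuteSQL.
print(collection.count_documents({}))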

That worked great! Thank you very much! My collection now has over 1300 records as expected.

Expert Contributor

Sweet. Glad I could help.
