Support Questions


What is the maximum number of columns the QueryDatabaseTable processor supports in a single run?

New Contributor



I have a table with 526 columns. The QueryDatabaseTable processor is not able to fetch all the rows and is failing. I am stuck on how to proceed, or on which processor to use to fetch the rows. Can someone please guide me?


Super Collaborator

If you are able to ingest a limited number of rows, the first thing to look at is the Fetch Size property in your processor. 


This should help you to ingest the table in chunks of reasonable size. 
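As a rough starting point, the relevant QueryDatabaseTable properties might look like this (the values are illustrative only and should be tuned for your table, driver, and heap size):

```
Fetch Size             : 10000   # rows the JDBC driver fetches per round trip
Max Rows Per Flow File : 10000   # rows written to each outgoing FlowFile
Output Batch Size      : 10      # transfer FlowFiles downstream in batches
```

Setting these keeps the processor from buffering the whole result set in memory at once.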




- Dennis Jaheruddin

If this answer helped, please mark it as 'solved' and/or if it is valuable for future readers please apply 'kudos'.

New Contributor

Hi Dennis,


Thanks for the response. I have already set the Fetch Size property to 20000 and Max Rows Per Flow File to 20000. My question is: is NiFi capable of handling 526 columns?

The reason I am asking is that I can see NiFi is pulling the data, but the performance is not up to the mark.

I would like to know whether there is a better approach, other than splitting and re-joining the columns. I think that if I go with around 200 columns, NiFi will be faster.
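To illustrate what I mean by splitting: something like two QueryDatabaseTable processors against the same table, each returning a subset of columns via the Columns to Return property (table and column names below are just placeholders):

```
Processor A (QueryDatabaseTable):
  Table Name            : wide_table           # placeholder table name
  Columns to Return     : id, c001, ..., c260  # first half of the columns, plus the key
  Maximum-value Columns : id

Processor B (QueryDatabaseTable):
  Table Name            : wide_table
  Columns to Return     : id, c261, ..., c526  # second half of the columns, plus the key
  Maximum-value Columns : id
```

Keeping the key column in both outputs would let me re-join the records downstream, but that is the extra step I was hoping to avoid.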



