
Handling cast fields greater than 32K in NiFi using ExecuteSQL


Running the NiFi version bundled with HDF Version 2.0.1, we are running into the 32K size limitation when handling very large blob/varchar fields returned by a DB2 select. Since about 10% of our records exceed this limit, I'm looking to handle these cases via a secondary processor or downstream service, which I assume will need to be outside the NiFi-supported processors. I'm interested in knowing what the community feels might be the best approach. Would Kafka, Sqoop, or Spark be a viable complement for this requirement? There are likely many valid approaches, so I'm looking for a solution that involves no customization, so that it stays supported in our production environment. Thanks, ~Sean
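For context, one approach I've considered is to keep the oversized column under 32K inside the select itself by splitting it into VARCHAR-sized chunks with SUBSTR, then recombining or routing the chunks downstream. A rough sketch against DB2 (the table and column names are placeholders for illustration):

-- Split an oversized CLOB/VARCHAR column into chunks that each
-- stay under the 32K limit when read through ExecuteSQL.
SELECT id,
       CAST(SUBSTR(big_col, 1,     32000) AS VARCHAR(32000)) AS big_col_part1,
       CAST(SUBSTR(big_col, 32001, 32000) AS VARCHAR(32000)) AS big_col_part2
FROM   my_schema.my_table
WHERE  LENGTH(big_col) > 32000;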



I see that the QueryDatabaseTable processor is provided to handle arbitrarily large result sets, so perhaps this is the way I should go. I see how to specify the table name, but I'm not sure how to introduce a 'select' statement. Can someone share how I might go about utilizing this processor?
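From the documentation it looks like QueryDatabaseTable doesn't accept a free-form SELECT in this release; it assembles the query from its properties instead. My rough understanding of how the pieces map, with hypothetical values (the property names are the processor's, the table and columns are placeholders):

-- Hypothetical QueryDatabaseTable settings:
--   Table Name            : MY_SCHEMA.MY_TABLE
--   Columns to Return     : ID, NAME, BIG_COL
--   Maximum-value Columns : ID
--
-- which the processor turns into a query roughly equivalent to:
SELECT ID, NAME, BIG_COL
FROM   MY_SCHEMA.MY_TABLE
WHERE  ID > ?   -- the largest ID value seen on the previous run

If that's right, the 'select' is expressed through Columns to Return rather than typed in directly, and anything fancier would presumably need a database view defined over the desired select, with Table Name pointed at the view.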