
Handling cast fields greater than 32K in NiFi using ExecuteSQL

Contributor

Running NiFi 1.0.0.2.0.1.0-12 on HDF 2.0.1, we are hitting the 32K size limitation when handling very large blob/varchar fields from a DB2 select. Since about 10% of our records exceed this limit, I'm looking to handle those cases via a secondary processor or a downstream service, which I assume will need to fall outside the standard NiFi processors. I'm interested in what the community feels would be the best approach. Would Kafka, Sqoop, or Spark be a viable complement for this requirement? There are likely many valid approaches, but I'm looking for a solution that requires no customization, so that it stays supported in our production environment. Thanks, ~Sean
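
For anyone weighing the downstream-service route, here is a minimal sketch, assuming a hypothetical DB2 table LARGE_DOCS with a CLOB column DOC_BODY and placeholder connection details (it also assumes the IBM DB2 JDBC driver is on the classpath). Reading the column as a character stream over plain JDBC sidesteps the 32K VARCHAR cast ceiling, because the value is never pushed through a CAST:

    import java.io.BufferedReader;
    import java.io.Reader;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class LargeClobFetch {
        public static void main(String[] args) throws Exception {
            // Placeholder DB2 JDBC URL and credentials -- substitute your own.
            String url = "jdbc:db2://db2host:50000/MYDB";
            // Hypothetical table/column names. Selecting the CLOB directly
            // (no CAST to VARCHAR) avoids the 32K cast limit; the driver
            // returns a stream instead of a fixed-size value.
            String sql = "SELECT ID, DOC_BODY FROM LARGE_DOCS WHERE LENGTH(DOC_BODY) > 32672";
            try (Connection conn = DriverManager.getConnection(url, "user", "password");
                 PreparedStatement ps = conn.prepareStatement(sql);
                 ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    long id = rs.getLong("ID");
                    // Read the oversized field incrementally rather than as one String.
                    try (Reader reader = rs.getCharacterStream("DOC_BODY");
                         BufferedReader br = new BufferedReader(reader)) {
                        long chars = 0;
                        int c;
                        while ((c = br.read()) != -1) {
                            chars++; // e.g. write to a file or hand off downstream here
                        }
                        System.out.println("Row " + id + ": " + chars + " chars");
                    }
                }
            }
        }
    }

The same streaming pattern is what would let a small standalone service hand the oversized fields off without ever buffering a whole value in memory.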

1 REPLY

Re: Handling cast fields greater than 32K in NiFi using ExecuteSQL

Contributor

I see that the QueryDatabaseTable processor is provided to handle arbitrarily large result sets. Perhaps that is the way I should go. I can see how to specify the table name, but I'm not sure how to introduce a 'select' statement. Can someone share how I might go about using this processor?
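
For anyone landing here with the same question, a configuration sketch: QueryDatabaseTable composes the SELECT itself from its properties rather than accepting a free-form query, so the 'select' is expressed through Table Name, Columns to Return, and an optional Maximum-value Column for incremental fetches. The property names below are from the NiFi 1.x documentation; the table and column values are hypothetical:

    Database Connection Pooling Service : DB2ConnectionPool  (your DBCPConnectionPool service)
    Table Name                          : LARGE_DOCS         (hypothetical)
    Columns to Return                   : ID, DOC_BODY       (defaults to all columns if blank)
    Maximum-value Columns               : ID                 (the processor remembers the
                                                              largest value seen so far)

With those settings, each run composes roughly SELECT ID, DOC_BODY FROM LARGE_DOCS WHERE ID > [largest ID previously seen], which is how the processor works through arbitrarily large tables incrementally.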