A problem converting to Avro in QueryDatabaseTable (NiFi)

New Contributor

Hello everyone,
My NiFi pipeline gets data from DB2 using the QueryDatabaseTable processor. When I launch it, I get an exception: org.apache.nifi.processor.exception.ProcessException: Error during database query or conversion of records. I found out that the problem is with a certain character in one of the table's fields. I can see this in the NiFi log (nifi-app.log), where the flow fails with java.nio.charset.UnmappableCharacterException: Input length = 1.
What I have tried: I used a different JDBC driver, and I set UTF-8 in bootstrap.conf and nifi-env.sh (I found that tip here). Neither helped.
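For reference, the bootstrap.conf change I tried looks roughly like this (the java.arg index is just the next free number in my file, so treat these lines as an example rather than my exact config):

    java.arg.57=-Dfile.encoding=UTF-8
    java.arg.58=-Dsun.jnu.encoding=UTF-8

and in nifi-env.sh something like:

    export LANG=en_US.UTF-8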
Could you help me, please?

 

4 REPLIES

New Contributor

Sorry, I forgot to specify the NiFi version. We use 1.11.4.

Contributor

It seems like DB2 is using a data type that Avro cannot represent. You can try setting Use Avro Logical Types to false in QueryDatabaseTable and then parse the data correctly within the flow.
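To be clear, the property I mean in the QueryDatabaseTable configuration is (as far as I remember the exact label):

    Use Avro Logical Types = false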

 

If you find the answer helpful, please accept this as a solution.

New Contributor

Thank you for your answer.
Yes, of course I tried that option. I know exactly which field is causing the problem; I even know which character it is. The field has the CHAR type. I tried to apply UTF-8, but it didn't help.

Contributor

Is it possible to show exactly what the character is? With the string logical type it should accept any character, as long as it is not an invalid or garbage value...
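If it helps to pin it down, a small stand-alone Java check along these lines (the charset name and the bytes are placeholders, not your real data) will report exactly where a strict decoder gives up, which is the same kind of condition that produces the UnmappableCharacterException you see:

    import java.nio.ByteBuffer;
    import java.nio.CharBuffer;
    import java.nio.charset.Charset;
    import java.nio.charset.CharsetDecoder;
    import java.nio.charset.CoderResult;
    import java.nio.charset.CodingErrorAction;

    public class CharsetCheck {
        public static void main(String[] args) {
            // Placeholder bytes standing in for the raw value of the suspect CHAR column.
            byte[] raw = {(byte) 0x41, (byte) 0x9D, (byte) 0x42};
            // Strict decoder: report errors instead of silently replacing them.
            CharsetDecoder decoder = Charset.forName("UTF-8").newDecoder()
                    .onMalformedInput(CodingErrorAction.REPORT)
                    .onUnmappableCharacter(CodingErrorAction.REPORT);
            ByteBuffer in = ByteBuffer.wrap(raw);
            CharBuffer out = CharBuffer.allocate(raw.length * 2);
            CoderResult result = decoder.decode(in, out, true);
            if (result.isError()) {
                // The buffer position stops at the start of the offending sequence.
                System.out.printf("Decode failed at byte offset %d (byte 0x%02X): %s%n",
                        in.position(), raw[in.position()] & 0xFF, result);
            } else {
                System.out.println("Decoded cleanly: " + out.flip());
            }
        }
    }

Swap in whatever charset your driver or flow is actually using and the bytes of the problematic value.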
