Cannot convert CHOICE, type must be explicit


Hi all,
I have the following NiFi template which ingests data from a CSV file and puts it into a database.

 

[screenshot: nifi-1.png]

The template is available here
https://github.com/cetic/fadi/blob/master/examples/basic/basic_example_final_template.xml

The template is composed of two processors: InvokeHTTP and PutDatabaseRecord.

 


It also has two controller services: CSVReader and DBCPConnectionPool.

 

[screenshots: nifi-4.png, nifi-5.png]

The complete explanation of this small example is available here
https://github.com/cetic/fadi/blob/master/USERGUIDE.md#3-ingest-measurements


The CSV file is here
https://raw.githubusercontent.com/cetic/fadi/master/examples/basic/sample_data.csv

 

The table is created as follows:

CREATE TABLE example_basic (
measure_ts TIMESTAMP NOT NULL,
temperature FLOAT (50)
);

 

Up to Apache NiFi 1.9.2 everything worked as expected!
Since Apache NiFi 1.10.0, I get an error in the PutDatabaseRecord processor:

"org.apache.nifi.serialization.record.util.IllegalTypeConversionException: Cannot convert CHOICE, type must be explicit"

[screenshot: nifi-6.png]

It seems to be related to this https://github.com/apache/nifi/blob/master/nifi-commons/nifi-record/src/main/java/org/apache/nifi/se...
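
For what it's worth, my guess is that since NiFi 1.10.0 the CSVReader infers the schema by default, and a column whose values can be read as more than one type ends up as a CHOICE (a union of possible types), which PutDatabaseRecord refuses to convert. As a rough illustration only (the field names are from my CSV, the union members are just a guess), the inferred schema could look like this:

{
  "type" : "record",
  "name" : "inferred",
  "fields" : [
    {"name" : "measure_ts", "type" : ["string", "long"]},
    {"name" : "temperature", "type" : ["string", "double"]}
  ]
}

whereas PutDatabaseRecord needs every field to have a single explicit type.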

 

Any idea how I can edit the template to fix this error?

 

1 ACCEPTED SOLUTION


The solution is to change the Schema Access Strategy of the CSVReader controller service to "Use String Fields From Header".

 

[screenshot: image.png]
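
For the record, my understanding is that with Use String Fields From Header the CSVReader takes the field names from the header row and reads every column as a string; PutDatabaseRecord then converts the string values to the actual column types (TIMESTAMP and FLOAT in my table). In terms of schema text it should be roughly equivalent to this (record name and namespace are just placeholders):

{
  "type" : "record",
  "name" : "example_basic",
  "namespace" : "my.example",
  "fields" : [
    {"name" : "measure_ts", "type" : "string"},
    {"name" : "temperature", "type" : "string"}
  ]
}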


2 REPLIES


I tried to update the schema text of the CSVReader with the following content:

{
  "type" : "record",
  "name" : "userInfo",
  "namespace" : "my.example",
  "fields" : [
    {"name" : "measure_ts", "type" : {"type" : "long", "logicalType" : "timestamp-millis"}},
    {"name" : "temperature", "type" : "long"}
  ]
}

but I still get the same error!
