Support Questions

Find answers, ask questions, and share your expertise

Failed to parse message from Kafka nifi.serialization.MalformedRecordException

Expert Contributor

Hi all,

I'm getting the error below after two new fields were added to the payload from Kafka:

OpenTX, data type: double precision
charges, data type: string

Note: before these two fields were added, the payload was processed fine with the existing JSON reader and writer.

Error:
05:25:40 UTC ERROR ConsumeKafkaRecord_2_6[id=8f8pq352c-0rcb] Failed to parse message from Kafka using the configured Record Reader. Will route message as its own FlowFile to the 'parse.failure' relationship: org.apache.nifi.serialization.MalformedRecordException: Successfully parsed a JSON object from input but failed to convert into a Record object with the given schema
- Caused by: org.apache.nifi.serialization.record.util.IllegalTypeConversionException: Cannot convert value [[Ljava.lang.Object;@6b9b8a0c] of type class [Ljava.lang.Object; for field invoiceLineList.ecoTaxValues to any of the following available Sub-Types for a Choice: [ARRAY[STRING], ARRAY[DOUBLE]]

input payload:

{
  "Identifier": "123456",
  "creationDate": "2024-11-22T12:22:01.331894Z",
  "docType": "01",
  "currency": "INR",
  "taxCode": [null, null, null, "04", null, null, null, null, null],
  "invoiceLineList": [
    {
      "line": 1,
      "charges": ["009", null, null],
      "OpenTX": [0.3, 0.9, null, null]
    },
    {
      "line": 2,
      "charges": ["002", null, null],
      "OpenTX": [0.1, 1.9, null, null]
    }
  ]
}
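For context, an explicit Avro schema for these lines would have to declare the array items as nullable unions. A minimal sketch covering just the two new fields (the record name is assumed, and all other fields are omitted) might look like:

```json
{
  "type": "record",
  "name": "InvoiceLine",
  "fields": [
    { "name": "line", "type": "int" },
    { "name": "charges", "type": { "type": "array", "items": ["null", "string"] } },
    { "name": "OpenTX", "type": { "type": "array", "items": ["null", "double"] } }
  ]
}
```

This is the kind of hand-maintained schema I'd prefer to avoid, since the real payload has many more fields.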

Below are the JSON reader and writer configs.

JsonTreeReader:

PradNiFi1236_0-1732645858110.png

JsonRecordSetWriter:

PradNiFi1236_1-1732645914350.png


I've tried changing the reader to an AvroReader, using "Confluent Content-Encoded Schema Reference" in the Avro Reader's "Schema Access Strategy" field and keeping the JsonRecordSetWriter the same as above, but it still isn't working.

Can anyone help me here?
@SAMSAL 

 

thanks!!

6 REPLIES

Super Guru

Hi @PradNiFi1236 ,

How are you adding the new fields? Your JSON appears to be invalid as provided.

Expert Contributor

@SAMSAL, sorry for the earlier input; I've updated it now. By the way, this payload is coming from Kafka.


Super Guru

Can you provide more information on your dataflow? Say you are using GenerateFlowFile to create the JSON Kafka output: what happens next? How are you enriching the data, and in which processor are you using the JSON reader/writer service that is causing the error? I need to see the full picture here, because when I use the same JSON you provided in a GenerateFlowFile processor and pass it to QueryRecord with the same JSON reader/writer service configuration, it works.

Expert Contributor

@SAMSAL, yes, from GenerateFlowFile there is no issue; it processes fine using QueryRecord.

Here the file is coming from a Kafka topic, so we are using ConsumeKafkaRecord as the top processor, and that is where we are getting the parse failures. There is also no schema from Kafka; we are inferring the schema from whatever file comes in.

If you read the error:

PradNiFi1236_0-1732717247389.png

it is picking up this value, ecoTaxValues; in the payload I mentioned it as
"OpenTX": [0.1,1.9,null,null]

The numeric values are inferred as double and the null values as string, and that mismatch is causing the issue.

Is there any way, from the JSON/Avro reader in ConsumeKafkaRecord, to skip or ignore these nulls other than writing an Avro schema? The actual payload has a great many fields, and whenever the source adds a new field I would have to adapt our schema again; that is what I'm trying to avoid. Could you please help?
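One workaround, if an explicit schema is off the table, is to consume with plain ConsumeKafka and strip the nulls out of the arrays in a scripted step before any record reader sees the JSON. A minimal pure-Python sketch of that cleaning logic (the function name is mine; this is illustrative, not NiFi's API):

```python
import json

def strip_nulls_from_arrays(obj):
    """Recursively drop null entries from lists so each array ends up
    with a single element type and schema inference sees no Choice."""
    if isinstance(obj, dict):
        return {k: strip_nulls_from_arrays(v) for k, v in obj.items()}
    if isinstance(obj, list):
        return [strip_nulls_from_arrays(v) for v in obj if v is not None]
    return obj

payload = json.loads('{"OpenTX": [0.1, 1.9, null, null], "charges": ["002", null, null]}')
cleaned = strip_nulls_from_arrays(payload)
print(json.dumps(cleaned))  # {"OpenTX": [0.1, 1.9], "charges": ["002"]}
```

In NiFi, logic like this could run inside an ExecuteScript or ExecuteStreamCommand step between ConsumeKafka and the record-based processors, so the inferred schema only ever sees one element type per array.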

 

thanks!!

 

Super Guru

Hi,

It is still not clear to me what exactly is happening and where. The error message references a field called ecoTaxValues, which doesn't seem to exist in the provided input. You also mentioned that you are using ConsumeKafka and getting a reader/writer error there, but the ConsumeKafka processor doesn't take a reader/writer service; ConsumeKafkaRecord does. Is that what you are using? Please describe the problem as specifically as you can. If you can't share the information for security reasons, I'd recommend reproducing the issue with sample data and a sample dataflow to make it easier to isolate the error. Also, please share a screenshot or accurate description of the dataflow from the inception of the input, along with the processor configurations and any services being used.

Expert Contributor

@SAMSAL, thanks for your reply. In fact, I asked the Kafka side not to send null values, and that sorted out the issue.
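For anyone hitting the same thing: the producer-side fix amounts to dropping the nulls from the arrays before serializing each record to the topic. A hedged sketch of that serialization step (field names taken from the sample payload; only top-level list fields are handled, and the function name is illustrative):

```python
import json

def serialize_without_nulls(record):
    """Drop None entries from list-valued fields before JSON-encoding
    the record for publishing to Kafka."""
    cleaned = {
        k: [v for v in val if v is not None] if isinstance(val, list) else val
        for k, val in record.items()
    }
    return json.dumps(cleaned).encode("utf-8")

line = {"line": 1, "charges": ["009", None, None], "OpenTX": [0.3, 0.9, None, None]}
print(serialize_without_nulls(line))
```

With nulls gone, every array has a single element type, so NiFi's schema inference no longer produces a Choice of ARRAY[STRING] and ARRAY[DOUBLE].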