
java.lang.ArrayIndexOutOfBoundsException while reading avroReader



New Contributor

I am using a CDC (change data capture) tool + Confluent Kafka 3.2.1 + Schema Registry on the event-producer side, and the NiFi ConsumeKafkaRecord_0_10 processor to consume the Avro messages from Kafka. To read the Avro messages I created an AvroReader with the Avro Schema Registry plugged in, but when it reads the messages off Kafka I get the exception below.

My first suspicion was that the messages on the topic were corrupt, but that is not the case: when I read the messages off the topic with the Confluent-provided Avro console consumer using the command below, I see no exception.

bin/kafka-avro-console-consumer --bootstrap-server localhost:9092 --property schema.registry.url="http://localhost:8081" --topic kaf_poc.poc_kafka.sourcedb.dbas.emp_nopk --from-beginning

2017-07-11 01:00:49,939 ERROR [Timer-Driven Process Thread-3] o.a.n.p.k.pubsub.ConsumeKafkaRecord_0_10 ConsumeKafkaRecord_0_10[id=2f997e79-015d-1000-398e-65916520b3f7] Failed to parse message from Kafka using the configured Record Reader. Will route message as its own FlowFile to the 'parse.failure' relationship:
java.lang.ArrayIndexOutOfBoundsException: 32
    at$Alternative.getSymbol(
    at
    at
    at
    at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(
    at
    at org.apache.avro.generic.GenericDatumReader.readField(
    at org.apache.avro.generic.GenericDatumReader.readRecord(
    at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(
    at
    at
    at org.apache.nifi.avro.AvroReaderWithExplicitSchema.nextAvroRecord(
    at org.apache.nifi.avro.AvroRecordReader.nextRecord(
    at org.apache.nifi.processors.kafka.pubsub.ConsumerLease.writeRecordData(
    at org.apache.nifi.processors.kafka.pubsub.ConsumerLease.lambda$processRecords$8(
    at java.util.HashMap$KeySpliterator.forEachRemaining(
    at$Head.forEach(
    at org.apache.nifi.processors.kafka.pubsub.ConsumerLease.processRecords(
    at org.apache.nifi.processors.kafka.pubsub.ConsumerLease.poll(
    at org.apache.nifi.processors.kafka.pubsub.ConsumeKafkaRecord_0_10.onTrigger(
    at org.apache.nifi.processor.AbstractProcessor.onTrigger(
    at org.apache.nifi.controller.StandardProcessorNode.onTrigger(
    at
    at
    at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$
    at java.util.concurrent.Executors$
    at java.util.concurrent.FutureTask.runAndReset(
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(
    at java.util.concurrent.ScheduledThreadPoolExecutor$
    at java.util.concurrent.ThreadPoolExecutor.runWorker(
    at java.util.concurrent.ThreadPoolExecutor$
    at


Re: java.lang.ArrayIndexOutOfBoundsException while reading avroReader

You currently can't use ConsumeKafkaRecord_0_10 to consume Confluent Avro. Confluent Avro is a special framing of Avro that carries additional information (a reference to the schema in the Schema Registry) in front of the Avro payload, so it cannot be read by regular Avro readers.
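For context, the "additional information" is Confluent's wire format: a magic byte (0x00) followed by a 4-byte big-endian schema-registry ID, and only then the actual Avro-encoded bytes. A plain Avro reader tries to decode those 5 header bytes as Avro data, which is what produces errors like the ArrayIndexOutOfBoundsException above. A minimal sketch of splitting that framing (the function name and example bytes are illustrative, not from any library):

```python
import struct

MAGIC_BYTE = 0  # Confluent wire format: 1 magic byte + 4-byte schema ID + Avro payload

def split_confluent_message(message: bytes):
    """Split a Confluent-framed Kafka value into (schema_id, avro_payload)."""
    if len(message) < 5 or message[0] != MAGIC_BYTE:
        raise ValueError("not a Confluent Avro message")
    schema_id = struct.unpack(">I", message[1:5])[0]  # big-endian registry ID
    return schema_id, message[5:]

# Example: frame a fake Avro payload with schema ID 42, then split it again
framed = bytes([MAGIC_BYTE]) + struct.pack(">I", 42) + b"\x02avro-bytes"
schema_id, payload = split_confluent_message(framed)
```

The schema ID tells a Confluent-aware consumer which schema to fetch from the registry before decoding the remaining bytes, which is exactly what a regular Avro reader has no way of doing.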

The master branch of Apache NiFi already has support for integration with Confluent: there will be a new "Schema Access Strategy" option, "Confluent Content-Encoded Schema Reference", which will allow the reader to handle Confluent Avro.
