Schema Registry metadata error from NiFi: Unrecognized field "validationLevel" .. not marked as ignorable

Guru

I have installed HDF 3.0.1. I am using a PublishKafkaRecord processor in NiFi to access a schema via HortonworksSchemaRegistry.

[Screenshot: 28392-screen-shot-2017-08-15-at-50646-pm.png]

[Screenshot: 28387-screen-shot-2017-08-15-at-32214-pm.png]

[Screenshot: 28389-screen-shot-2017-08-15-at-34548-pm.png]

I am getting the below error from PublishKafkaRecord.

Caused by: com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "validationLevel" (class com.hortonworks.registries.schemaregistry.SchemaMetadata), not marked as ignorable (6 known properties: "compatibility", "type", "name", "description", "evolve", "schemaGroup"])
 at [Source: {"schemaMetadata":{"type":"avro","schemaGroup":"test","name":"simple","description":"simple","compatibility":"BACKWARD","validationLevel":"ALL","evolve":true},"id":3,"timestamp":1502815970781}; line: 1, column: 140] (through reference chain: com.hortonworks.registries.schemaregistry.SchemaMetadataInfo["schemaMetadata"]->com.hortonworks.registries.schemaregistry.SchemaMetadata["validationLevel"])

[Screenshot: 28391-screen-shot-2017-08-15-at-35847-pm.png]

How do I get NiFi to ignore the validationLevel attribute for the schema and not throw this error?
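For context on what is failing: the stack trace shows the registry client reading the server's JSON response with Jackson (SchemaRegistryClient.readEntity calling ObjectMapper.readValue), and a Jackson-bound class rejects JSON fields it doesn't declare unless unknown properties are explicitly ignored. Below is a minimal, self-contained sketch of that failure mode and the lenient configuration; TargetMeta is a hypothetical stand-in, not the registry's actual SchemaMetadata class.

import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException;

public class UnknownFieldDemo {

    // Hypothetical stand-in for SchemaMetadata: it declares "name" but
    // knows nothing about "validationLevel".
    static class TargetMeta {
        public String name;
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"name\":\"simple\",\"validationLevel\":\"ALL\"}";

        // Default mapper: unknown fields are fatal, reproducing the error above.
        try {
            new ObjectMapper().readValue(json, TargetMeta.class);
        } catch (UnrecognizedPropertyException e) {
            System.out.println("Strict mapper failed: " + e.getOriginalMessage());
        }

        // Lenient mapper: unknown fields are silently dropped.
        ObjectMapper lenient = new ObjectMapper()
                .disable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES);
        TargetMeta meta = lenient.readValue(json, TargetMeta.class);
        System.out.println("Lenient mapper parsed name=" + meta.name);
    }
}

Since the failing class ships inside the HDF NiFi bundle, this switch can't be flipped from the NiFi UI; the replies below cover the practical options.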

1 ACCEPTED SOLUTION

Master Guru

This is a bug in the HDF 3.0.1 release related to the integration of NiFi and Schema Registry.

It will be fixed in a future release. For now, you could stick with HDF 3.0.0 if you need this to work.


9 REPLIES

Master Guru

Can you post the schema? Can you post an example CSV? Are you running Kafka locally? I see a localhost link there.

You must use matching versions of NiFi and Schema Registry; they must come from the same HDF 3.x release. So you can't download NiFi from nifi.apache.org and try that.

Rising Star

@Greg Keys

What is your Schema Access Strategy on the controller service for your record writer (AvroRecordSetWriter)?

Master Guru

You have to set the schema name as an attribute.

Make sure you create your Kafka topic.

Make sure you set up the CSV reader and Avro writer correctly.

[Screenshot: 27489-updateattribue.png]

Set the schema name; setting the content type doesn't hurt.
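In practice (the exact property names here are assumptions, since the actual values live in the screenshot), that usually means adding a schema.name attribute in UpdateAttribute, e.g. schema.name = simple, so a record reader/writer configured with the "Use 'Schema Name' Property" access strategy can look the schema up in the registry.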

[Screenshot: 27490-schema.png]

Add the schema to the Hortonworks Schema Registry and make sure the names match (here, "simple").
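For illustration, here is a minimal Avro schema that could be registered under the name "simple", plus a quick check with the Avro library that the definition parses. The fields are assumptions, since the asker's actual schema wasn't posted.

import org.apache.avro.Schema;

public class SimpleSchemaCheck {

    // Hypothetical "simple" schema; substitute your real fields.
    private static final String SIMPLE_SCHEMA = "{"
            + "\"type\":\"record\","
            + "\"name\":\"simple\","
            + "\"fields\":["
            + "{\"name\":\"id\",\"type\":\"int\"},"
            + "{\"name\":\"value\",\"type\":\"string\"}"
            + "]}";

    public static void main(String[] args) {
        // Parse to confirm the text is valid Avro before pasting it
        // into the Schema Registry UI under the name "simple".
        Schema schema = new Schema.Parser().parse(SIMPLE_SCHEMA);
        System.out.println("Parsed schema: " + schema.getFullName());
    }
}

The name registered in the registry must match the schema.name attribute set in the flow, or the lookup will fail for a different reason than the one above.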

[Screenshot: 27491-publickafkarecord-0-10.png]

Set the reader and writer, and the Kafka broker and topic.

[Screenshot: 27492-csvreader.png]

Configure the reader.

[Screenshot: 27493-avrorecordsetwriter.png]

Configure the writer.

Create the topic beforehand, and then you can consume from it after the data gets pushed through NiFi:

./kafka-console-consumer.sh --zookeeper princeton10.field.hortonworks.com:2181 --topic simple


Using the ConsoleConsumer with old consumer is deprecated and will be removed in a future major release. Consider using the new consumer by passing [bootstrap-server] instead of [zookeeper].
{metadata.broker.list=princeton10.field.hortonworks.com:6667, request.timeout.ms=30000, client.id=console-consumer-7447, security.protocol=PLAINTEXT}
aabaabb
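As the deprecation warning itself suggests, the same topic can be consumed with the new consumer by passing --bootstrap-server (pointing at the broker, here princeton10.field.hortonworks.com:6667) instead of --zookeeper.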

Master Guru

This is a bug in the HDF 3.0.1 release related to the integration of NiFi and Schema Registry.

It will be fixed in a future release. For now, you could stick with HDF 3.0.0 if you need this to work.

Guru

The bug is fixed in the HDF 3.0.1.1 patch release and the issue is resolved. Release notes:

http://dev.hortonworks.com.s3.amazonaws.com/HDPDocuments/HDF3/HDF-3.0.1.1/bk_release-notes/content/c...

New Contributor

@Bryan Bende

I have installed HDF 3.0.0 and 3.0.1.1, but I couldn't choose the Storm service (it can only be replaced by the null service) when installing the NiFi, SAM, and Registry services.

So I can't complete the deployment.

[Screenshot: 38481-capture.png]

New Contributor

@Bryan Bende & @Greg Keys, I am running version 3.0.1.1 of HDF and I still receive this error when trying to run the streaming analytics demo. Is there a new version of the simulator available, or are additional steps required beyond running version 3.0.1.1?

New Contributor

@Bryan Bende & @Greg Keys, the error seems to persist with HDF 3.0.2. Is there any news on a workaround or a fix?

Contributor

Hi all.

The main issue, from what I can gather, is that the validationLevel column in the schema metadata table is a recent addition (added in Schema Registry 0.3.1). I suspect the NiFi processors shipped with HDF 3.0.1 are not built to deal with it.

I first ran into the same error while attempting the streaming analytics demo: https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.0.1.1/bk_getting-started-with-stream-analytics/...

In the demo we are asked to run a data generator, stream-simulator-jar-with-dependencies.jar, to generate data and publish it to predefined Kafka topics.

NiFi is tasked with consuming from these topics before enriching the data and publishing it to new Kafka topics.

I first ran into the registry error discussed above while running stream-simulator-jar-with-dependencies.jar. To resolve it, I had to extract the jar and modify SchemaMetadata as follows:

package com.hortonworks.registries.schemaregistry;

import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.google.common.base.Preconditions;
import java.io.Serializable;

// Instruct Jackson to silently skip any JSON properties this class
// does not declare, such as the new "validationLevel" field.
@JsonIgnoreProperties(ignoreUnknown = true)
public class SchemaMetadata implements Serializable {
    // ... rest of the original class body unchanged ...
}

Then recompile SchemaMetadata and repackage the jar.

The stream-simulator-jar-with-dependencies.jar should now write to Kafka as desired.
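(For example, assuming the modified source is saved as SchemaMetadata.java: compile it against the original jar with javac -d . -cp stream-simulator-jar-with-dependencies.jar SchemaMetadata.java, which recreates the com/hortonworks/registries/schemaregistry/ package tree, then update the archive in place with jar uf stream-simulator-jar-with-dependencies.jar com/hortonworks/registries/schemaregistry/SchemaMetadata.class. Exact steps will vary with your JDK and how you extracted the jar.)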

So now my data generator works:

EventSimulator-akka.actor.default-dispatcher-2 INFO  collectors.KafkaEventSerializedWithRegistryCollector - {"eventTime": "2017-08-17 09:13:10.658", "eventSource": "truck_geo_event", "truckId": 24, "driverId": 18, "driverName": "Tom McCuch", "routeId": 14, "route": "Springfield to KC Via Hanibal", "eventType": "Normal", "latitude": 39.78, "longitude": -89.66, "correlationId": 1}
EventSimulator-akka.actor.default-dispatcher-2 DEBUG collectors.KafkaEventSerializedWithRegistryCollector - Creating Avro truck geo event[org.apache.avro.generic.GenericData$Record@1434ef29[schema={"type":"record","name":"truckgeoevent","namespace":"hortonworks.hdp.refapp.trucking","fields":[{"name":"eventTime","type":"string"},{"name":"eventSource","type":"string"},{"name":"truckId","type":"int"},{"name":"driverId","type":"int"},{"name":"driverName","type":"string"},{"name":"routeId","type":"int"},{"name":"route","type":"string"},{"name":"eventType","type":"string"},{"name":"latitude","type":"double"},{"name":"longitude","type":"double"},{"name":"correlationId","type":"long"}]},values={2017-08-17 09:13:10.658,truck_geo_event,24,18,Tom McCuch,14,Springfield to KC Via Hanibal,Normal,39.78,-89.66,1}]] for driver[18] in truck [24|18|Tom McCuch|14|Springfield to KC Via Hanibal]
EventSimulator-akka.actor.default-dispatcher-2 DEBUG collectors.KafkaEventSerializedWithRegistryCollector - Creating Avro truck speed event[org.apache.avro.generic.GenericData$Record@5ea9a8b8[schema={"type":"record","name":"truckspeedevent","namespace":"hortonworks.hdp.refapp.trucking","fields":[{"name":"eventTime","type":"string"},{"name":"eventSource","type":"string"},{"name":"truckId","type":"int"},{"name":"driverId","type":"int"},{"name":"driverName","type":"string"},{"name":"routeId","type":"int"},{"name":"route","type":"string"},{"name":"speed","type":"int"}]},values={2017-08-17 09:13:10.66,truck_speed_event,24,18,Tom McCuch,14,Springfield to KC Via Hanibal,59}]] for driver[18] in truck [24|18|Tom McCuch|14|Springfield to KC Via Hanibal]


However, attempting to consume from the Kafka topic with NiFi still throws the same error:

Caused by: com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "validationLevel" (class com.hortonworks.registries.schemaregistry.SchemaMetadata), not marked as ignorable (6 known properties: "compatibility", "type", "name", "description", "evolve", "schemaGroup"])
 at [Source: {"schemaMetadata":{"type":"avro","schemaGroup":"truck-sensors-kafka","name":"raw-truck_events_avro","description":"Raw Geo events from trucks in Kafka Topic","compatibility":"BACKWARD","validationLevel":"ALL","evolve":true},"id":1,"timestamp":1502972321149}; line: 1, column: 205] (through reference chain: com.hortonworks.registries.schemaregistry.SchemaMetadataInfo["schemaMetadata"]->com.hortonworks.registries.schemaregistry.SchemaMetadata["validationLevel"])
        at com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException.from(UnrecognizedPropertyException.java:51)
        at com.fasterxml.jackson.databind.DeserializationContext.reportUnknownProperty(DeserializationContext.java:836)
        at com.fasterxml.jackson.databind.deser.std.StdDeserializer.handleUnknownProperty(StdDeserializer.java:1045)
        at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownProperty(BeanDeserializerBase.java:1352)
        at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownVanilla(BeanDeserializerBase.java:1330)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:262)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:125)
        at com.fasterxml.jackson.databind.deser.SettableBeanProperty.deserialize(SettableBeanProperty.java:520)
        at com.fasterxml.jackson.databind.deser.impl.FieldProperty.deserializeAndSet(FieldProperty.java:101)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:256)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:125)
        at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3702)
        at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2714)
        at com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient.readEntity(SchemaRegistryClient.java:686)
        ... 47 common frames omitted



I expect that to resolve this, I would need to patch the NiFi processor's registry client in a similar way to how I corrected the data generator jar.
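Presumably that means patching the SchemaRegistryClient classes bundled inside the NiFi NAR with the same @JsonIgnoreProperties change, or simply moving to a release where the NiFi-side client and the registry server agree on the metadata fields, as the accepted answer suggests.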