Created on 08-15-2017 07:27 PM - edited 08-17-2019 07:25 PM
I have installed HDF 3.0.1. I am using a PublishKafkaRecord processor in NiFi to access a schema via HortonworksSchemaRegistry.
I am getting the below error from PublishKafkaRecord.
Caused by: com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "validationLevel" (class com.hortonworks.registries.schemaregistry.SchemaMetadata), not marked as ignorable (6 known properties: "compatibility", "type", "name", "description", "evolve", "schemaGroup"]) at [Source: {"schemaMetadata":{"type":"avro","schemaGroup":"test","name":"simple","description":"simple","compatibility":"BACKWARD","validationLevel":"ALL","evolve":true},"id":3,"timestamp":1502815970781}; line: 1, column: 140] (through reference chain: com.hortonworks.registries.schemaregistry.SchemaMetadataInfo["schemaMetadata"]->com.hortonworks.registries.schemaregistry.SchemaMetadata["validationLevel"])
How do I get NiFi to ignore the validationLevel attribute for the schema and not throw this error?
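For context, the failure comes from Jackson's default strict handling of unknown JSON properties: the registry server returns a "validationLevel" field that the older SchemaMetadata class bundled with NiFi doesn't know about. A minimal standalone reproduction of the error and the ignore-unknown fix (the class and field names here are illustrative stand-ins, not NiFi's actual code):

import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

public class IgnoreUnknownDemo {
    // Illustrative stand-in for the registry's SchemaMetadata POJO
    public static class Metadata {
        public String name;
        public String compatibility;
    }

    public static void main(String[] args) throws Exception {
        // JSON with an extra field the POJO doesn't declare, like "validationLevel" above
        String json = "{\"name\":\"simple\",\"compatibility\":\"BACKWARD\",\"validationLevel\":\"ALL\"}";

        ObjectMapper mapper = new ObjectMapper();
        // Default behaviour (FAIL_ON_UNKNOWN_PROPERTIES = true) would throw
        // UnrecognizedPropertyException here, exactly as in the stack trace above.
        mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);

        Metadata m = mapper.readValue(json, Metadata.class);
        System.out.println(m.name + " / " + m.compatibility); // prints: simple / BACKWARD
    }
}

The catch is that this setting lives inside the bundled schema registry client, not in anything exposed through the NiFi UI, which is why the answers below treat it as a product bug rather than a configuration problem.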
Created 08-16-2017 03:52 PM
Can you post the schema? Can you post an example CSV? Are you running Kafka locally? I see a localhost link there.
You must use the same versions of NiFi and Schema Registry; they must be from the same HDF 3.x release. So you can't download NiFi from nifi.apache.org and try that.
Created 08-16-2017 03:56 PM
What is your Schema Access Strategy on the controller service for your record writer (AvroRecordSetWriter)?
Created on 08-16-2017 05:16 PM - edited 08-17-2019 07:25 PM
You have to set the schema name as an attribute.
Make sure you create your Kafka topic.
Make sure you set up the CSV reader and Avro writer correctly.
Set the schema name; setting the content type as well doesn't hurt.
Add the schema to the Hortonworks Schema Registry, matching the name used in NiFi ("simple" here); see the registry client sketch at the end of this post.
Set the reader, writer, server, and topic on the processor.
Configure the reader.
Configure the writer.
Create a topic beforehand, and then you can consume from it after the data gets pushed through NiFi:
./kafka-console-consumer.sh --zookeeper princeton10.field.hortonworks.com:2181 --topic simple
Using the ConsoleConsumer with old consumer is deprecated and will be removed in a future major release. Consider using the new consumer by passing [bootstrap-server] instead of [zookeeper].
{metadata.broker.list=princeton10.field.hortonworks.com:6667, request.timeout.ms=30000, client.id=console-consumer-7447, security.protocol=PLAINTEXT}
aabaabb
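As the deprecation warning suggests, the new consumer takes a bootstrap server rather than a ZooKeeper address. Roughly the same check done with the Java consumer API, as a sketch: the broker address is copied from the output above, and note that PublishKafkaRecord writes Avro-encoded bytes, so a plain string dump will look garbled unless you plug in an Avro-aware deserializer.

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class SimpleTopicConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker address taken from the console output above; adjust for your cluster
        props.put("bootstrap.servers", "princeton10.field.hortonworks.com:6667");
        props.put("group.id", "simple-topic-test"); // any unused consumer group id
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("simple"));
            while (true) {
                // poll(long) is the signature in the 0.10.x-era clients shipped with HDF 3.x
                ConsumerRecords<String, String> records = consumer.poll(1000L);
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println(record.value());
                }
            }
        }
    }
}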
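For the "add schema to the registry" step, here is a client-side sketch based on the examples in the hortonworks/registry project; the URL, port, and record fields are assumptions (the original schema was never posted), and the exact signatures should be checked against the registry version in your HDF release. The group, name, and compatibility values match the metadata shown in the error above.

import com.hortonworks.registries.schemaregistry.SchemaCompatibility;
import com.hortonworks.registries.schemaregistry.SchemaIdVersion;
import com.hortonworks.registries.schemaregistry.SchemaMetadata;
import com.hortonworks.registries.schemaregistry.SchemaVersion;
import com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient;
import java.util.HashMap;
import java.util.Map;

public class RegisterSimpleSchema {
    public static void main(String[] args) throws Exception {
        Map<String, Object> config = new HashMap<>();
        // Registry URL is an assumption; point this at your own registry host/port
        config.put(SchemaRegistryClient.Configuration.SCHEMA_REGISTRY_URL.name(),
                "http://localhost:7788/api/v1");
        SchemaRegistryClient client = new SchemaRegistryClient(config);

        // Group "test", name "simple", BACKWARD compatibility, as in the error JSON above
        SchemaMetadata metadata = new SchemaMetadata.Builder("simple")
                .type("avro")
                .schemaGroup("test")
                .description("simple")
                .compatibility(SchemaCompatibility.BACKWARD)
                .build();

        // Hypothetical record layout; substitute the fields of your actual CSV
        String schemaText = "{\"type\":\"record\",\"name\":\"simple\",\"fields\":["
                + "{\"name\":\"id\",\"type\":\"int\"},"
                + "{\"name\":\"name\",\"type\":\"string\"}]}";

        SchemaIdVersion version =
                client.addSchemaVersion(metadata, new SchemaVersion(schemaText, "initial version"));
        System.out.println("Registered version: " + version);
    }
}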
Created 08-18-2017 01:57 PM
This is a bug in the HDF 3.0.1 release related to the integration of NiFi and Schema Registry.
It will be fixed in a future release. For now, you could stick with HDF 3.0.0 if you need this to work.
Created 08-24-2017 12:08 PM
The bug is fixed in the 3.0.1.1 patch and the issue is resolved. Release notes as follows:
Created on 09-01-2017 02:21 AM - edited 08-17-2019 07:24 PM
I have installed HDF 3.0.0 and 3.0.1.1, but I couldn't choose the Storm service (it can only be replaced by a null service) when installing the NiFi, SAM, and Registry services,
so I can't complete the deployment.
Created 12-08-2017 07:42 PM
@Bryan Bende & @Greg Keys, I am running version 3.0.1.1 of HDF and I still receive this error when trying to run the streaming analytics demo. Is there a new version of the simulator available or additional steps required beyond running version 3.0.1.1?
Created 02-14-2018 02:40 PM
@Bryan Bende & @Greg Keys the error seems to persist with HDF 3.0.2. Are there any news on a workaround or a fix?
Created 08-18-2017 06:24 PM
Hi all.
The main issue, from what I can gather, is that the validationLevel column in the schema metadata table is a recent addition (added in Schema Registry 0.3.1). I suspect the NiFi processors shipped with HDF 3.0.1 are not correctly configured to deal with it.
I first ran into the same error while attempting the streaming analytics demo: https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.0.1.1/bk_getting-started-with-stream-analytics/...
In the demo we are asked to run a data generator, stream-simulator-jar-with-dependencies.jar, to generate data and publish it to predefined Kafka topics.
NiFi is tasked with consuming from these topics before enriching the data and publishing it to new Kafka topics.
The registry error discussed above first appeared while running stream-simulator-jar-with-dependencies.jar. To resolve it, I had to extract the jar and modify SchemaMetadata as follows:
package com.hortonworks.registries.schemaregistry;

import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.google.common.base.Preconditions;
import com.hortonworks.registries.schemaregistry.SchemaCompatibility;
import java.io.Serializable;

// The added annotation: tells Jackson to skip fields (like "validationLevel")
// that this version of the class doesn't declare.
@JsonIgnoreProperties( ignoreUnknown = true )
public class SchemaMetadata implements Serializable {
    // ... rest of the class unchanged
}
then recompile SchemaMetadata and repackage the jar.
The stream-simulator-jar-with-dependencies.jar should now write to Kafka as desired.
So now my data generator works:
EventSimulator-akka.actor.default-dispatcher-2 INFO collectors.KafkaEventSerializedWithRegistryCollector - {"eventTime": "2017-08-17 09:13:10.658", "eventSource": "truck_geo_event", "truckId": 24, "driverId": 18, "driverName": "Tom McCuch", "routeId": 14, "route": "Springfield to KC Via Hanibal", "eventType": "Normal", "latitude": 39.78, "longitude": -89.66, "correlationId": 1}
EventSimulator-akka.actor.default-dispatcher-2 DEBUG collectors.KafkaEventSerializedWithRegistryCollector - Creating Avro truck geo event[org.apache.avro.generic.GenericData$Record@1434ef29[schema={"type":"record","name":"truckgeoevent","namespace":"hortonworks.hdp.refapp.trucking","fields":[{"name":"eventTime","type":"string"},{"name":"eventSource","type":"string"},{"name":"truckId","type":"int"},{"name":"driverId","type":"int"},{"name":"driverName","type":"string"},{"name":"routeId","type":"int"},{"name":"route","type":"string"},{"name":"eventType","type":"string"},{"name":"latitude","type":"double"},{"name":"longitude","type":"double"},{"name":"correlationId","type":"long"}]},values={2017-08-17 09:13:10.658,truck_geo_event,24,18,Tom McCuch,14,Springfield to KC Via Hanibal,Normal,39.78,-89.66,1}]] for driver[18] in truck [24|18|Tom McCuch|14|Springfield to KC Via Hanibal]
EventSimulator-akka.actor.default-dispatcher-2 DEBUG collectors.KafkaEventSerializedWithRegistryCollector - Creating Avro truck speed event[org.apache.avro.generic.GenericData$Record@5ea9a8b8[schema={"type":"record","name":"truckspeedevent","namespace":"hortonworks.hdp.refapp.trucking","fields":[{"name":"eventTime","type":"string"},{"name":"eventSource","type":"string"},{"name":"truckId","type":"int"},{"name":"driverId","type":"int"},{"name":"driverName","type":"string"},{"name":"routeId","type":"int"},{"name":"route","type":"string"},{"name":"speed","type":"int"}]},values={2017-08-17 09:13:10.66,truck_speed_event,24,18,Tom McCuch,14,Springfield to KC Via Hanibal,59}]] for driver[18] in truck [24|18|Tom McCuch|14|Springfield to KC Via Hanibal]
However, attempting to consume from the Kafka topic with NiFi still throws the error:
Caused by: com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "validationLevel" (class com.hortonworks.registries.schemaregistry.SchemaMetadata), not marked as ignorable (6 known properties: "compatibility", "type", "name", "description", "evolve", "schemaGroup"]) at [Source: {"schemaMetadata":{"type":"avro","schemaGroup":"truck-sensors-kafka","name":"raw-truck_events_avro","description":"Raw Geo events from trucks in Kafka Topic","compatibility":"BACKWARD","validationLevel":"ALL","evolve":true},"id":1,"timestamp":1502972321149}; line: 1, column: 205] (through reference chain: com.hortonworks.registries.schemaregistry.SchemaMetadataInfo["schemaMetadata"]->com.hortonworks.registries.schemaregistry.SchemaMetadata["validationLevel"])
    at com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException.from(UnrecognizedPropertyException.java:51)
    at com.fasterxml.jackson.databind.DeserializationContext.reportUnknownProperty(DeserializationContext.java:836)
    at com.fasterxml.jackson.databind.deser.std.StdDeserializer.handleUnknownProperty(StdDeserializer.java:1045)
    at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownProperty(BeanDeserializerBase.java:1352)
    at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownVanilla(BeanDeserializerBase.java:1330)
    at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:262)
    at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:125)
    at com.fasterxml.jackson.databind.deser.SettableBeanProperty.deserialize(SettableBeanProperty.java:520)
    at com.fasterxml.jackson.databind.deser.impl.FieldProperty.deserializeAndSet(FieldProperty.java:101)
    at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:256)
    at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:125)
    at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3702)
    at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2714)
    at com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient.readEntity(SchemaRegistryClient.java:686)
    ... 47 common frames omitted
I expect that to resolve this I would need to patch the NiFi processor's bundled registry client in a similar way to how I corrected the data loader jar.
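For what it's worth, Jackson also supports applying the annotation without editing the class, via a mixin. This is only a sketch of the mechanism: it helps where you control the ObjectMapper the registry client uses, and the stock SchemaRegistryClient builds its own mapper internally, which is why patching the jar was needed in the first place.

import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.hortonworks.registries.schemaregistry.SchemaMetadata;

public class MixinPatch {
    // Mixin type carrying the annotation we want applied to SchemaMetadata
    @JsonIgnoreProperties(ignoreUnknown = true)
    abstract static class IgnoreUnknown {}

    public static ObjectMapper lenientMapper() {
        ObjectMapper mapper = new ObjectMapper();
        // Deserializes SchemaMetadata as if it carried @JsonIgnoreProperties,
        // without touching the class's bytecode
        mapper.addMixIn(SchemaMetadata.class, IgnoreUnknown.class);
        return mapper;
    }
}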