
SAM Application Kafka Source Fails

New Contributor

I see this error in the Storm UI for my SAM application:

com.hortonworks.registries.schemaregistry.serde.SerDesException: Unknown protocol id [79] received while deserializing the payload
    at com.hortonworks.registries.schemaregistry.serdes.avro.AvroSnapshotDeserializer.retrieveProtocolId(
    at com.hortonworks.registries.schemaregistry.serdes.avro.AvroSnapshotDeserializer.retrieveProtocolId(
    at com.hortonworks.registries.schemaregistry.serde.AbstractSnapshotDeserializer.deserialize(
    at com.hortonworks.streamline.streams.runtime.storm.spout.AvroKafkaSpoutTranslator.apply(
    at org.apache.storm.kafka.spout.KafkaSpout.emitTupleIfNotEmitted(
    at org.apache.storm.kafka.spout.KafkaSpout.emit(
    at org.apache.storm.kafka.spout.KafkaSpout.nextTuple(
    at org.apache.storm.daemon.executor$fn__5136$fn__5151$fn__5182.invoke(executor.clj:647)
    at org.apache.storm.util$async_loop$fn__553.invoke(util.clj:484)




Super Guru
@Brad Penelli

This looks like a Schema Registry issue. Is the schema name specified in the registry correct (i.e., no typos)? I would also avoid dashes or other special characters in the schema name. If everything else looks right, try simply restarting Schema Registry; that solved the problem for me.

@Brad Penelli

Did you get to the bottom of this error? I'm hitting the same problem, except that the protocol id is always 2.



Super Guru

Schema Registry is very picky: all the names must match exactly, so verify them. Schema names also cannot contain dashes, spaces, or other special characters. Stick to letters and numbers.

New Contributor

It's not the dashes. It's expecting a Protocol Id to be specified in the first byte of the payload...

Here is the code associated with the error (highlighted below).
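To make the failure mode concrete: the deserializer reads the first byte of each Kafka message and treats it as a serialization protocol id; if that byte is not one of the ids it recognizes, it throws the SerDesException shown above. A minimal sketch of that check (the class name and the KNOWN_PROTOCOL_IDS set here are illustrative, not the actual registry source):

```java
import java.util.Set;

public class ProtocolIdCheck {
    // Illustrative set of protocol ids; the real ids come from the SerDes
    // protocols registered with Schema Registry.
    static final Set<Byte> KNOWN_PROTOCOL_IDS = Set.of((byte) 1, (byte) 2, (byte) 3);

    // Mimics the failing check: the first payload byte must be a known protocol id.
    static byte retrieveProtocolId(byte[] payload) {
        if (payload.length == 0 || !KNOWN_PROTOCOL_IDS.contains(payload[0])) {
            int id = payload.length == 0 ? -1 : payload[0];
            throw new RuntimeException(
                    "Unknown protocol id [" + id + "] received while deserializing the payload");
        }
        return payload[0];
    }

    public static void main(String[] args) {
        // A record written without the registry header: the "protocol id" is just
        // whatever byte the record data happens to start with (79 is ASCII 'O').
        byte[] headerless = {79, 1, 2, 3};
        try {
            retrieveProtocolId(headerless);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

So an id like 79 usually means the message has no registry header at all, not that the wrong schema was registered.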



Super Guru

This is critical: use the 'HWX-Content Encoded Schema Reference' schema write strategy.


@Timothy Spann @Brad Penelli

FYI, I was able to get around this problem. In my case, the records were written to Kafka by NiFi using the PublishKafkaRecord processor, with AvroRecordSetWriter configured as the record writer. The AvroRecordSetWriter controller service itself had its Schema Write Strategy set to 'Use Schema Name'.

The stack trace in the error above shows that the deserializer expects a different kind of metadata at the start of each message, including a protocol id. After setting the Schema Write Strategy to 'HWX-Content Encoded Schema Reference', stopping the NiFi flow, deleting and recreating the Kafka topic (deleting the messages would also work), and restarting everything, it worked again.

Attached is a screen grab of the AvroRecordSetWriter Controller Service: nifi.png
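For anyone curious what that setting changes on the wire: with 'HWX-Content Encoded Schema Reference', the writer prepends a small header to each message that the registry-aware deserializer can parse before reading the Avro bytes. The sketch below assumes a version-1 layout of a 1-byte protocol id, an 8-byte schema metadata id, and a 4-byte schema version; the exact widths are an assumption, so verify them against your Schema Registry release:

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

public class HwxHeaderSketch {
    // Assumed layout: 1-byte protocol id + 8-byte schema metadata id
    // + 4-byte schema version = 13 header bytes before the Avro payload.
    static final byte PROTOCOL_ID = 1;

    static byte[] encode(long schemaMetadataId, int schemaVersion, byte[] avroPayload) {
        return ByteBuffer.allocate(13 + avroPayload.length)
                .put(PROTOCOL_ID)
                .putLong(schemaMetadataId)
                .putInt(schemaVersion)
                .put(avroPayload)
                .array();
    }

    // Reads the header back; a consumer would use these ids to fetch the schema
    // from Schema Registry before deserializing the remaining bytes.
    static long[] decodeHeader(byte[] message) {
        ByteBuffer buf = ByteBuffer.wrap(message);
        return new long[] { buf.get(), buf.getLong(), buf.getInt() };
    }

    public static void main(String[] args) {
        byte[] msg = encode(42L, 3, new byte[] {10, 20});
        // Prints the recovered protocol id, schema metadata id, and version.
        System.out.println(Arrays.toString(decodeHeader(msg)));
    }
}
```

With the 'Use Schema Name' strategy, by contrast, NiFi carries the schema name in an attribute/Kafka header and writes bare Avro bytes, which is why the Storm-side deserializer saw arbitrary data where it expected this header.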