I'm trying to build an example integrating NiFi, Schema Registry, Kafka and Storm all together. So far I've managed to integrate NiFi, Schema Registry and Kafka, but I'm stuck at the point of integrating with Storm.
I'm using the PublishKafkaRecord_0_11 and ConsumeKafkaRecord_0_11 processors with the "Attributes to Send as Headers" property, so that the consumer can read the "schema.name" header and deserialize the Avro messages. This works fine within NiFi. Now I want to keep producing the messages with NiFi as I do today, but consume them with a Storm spout in Java via the storm-kafka integration, using the "schema.name" Kafka header to pick the schema. Is this possible?
Here you can see the NiFi template... I would like to replace the NiFi consumer processor with a Storm spout. nifi-kafka-sr.xml
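For reference, this is a sketch of what I imagine the spout side would look like, assuming storm-kafka-client is used and that the Kafka record headers written by NiFi are visible to the consumer. The class name `SchemaNameTranslator` and the tuple field names are my own invention; the `RecordTranslator` interface and `ConsumerRecord.headers()` come from storm-kafka-client and kafka-clients (0.11+):

```java
import java.nio.charset.StandardCharsets;
import java.util.List;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.header.Header;
import org.apache.storm.kafka.spout.RecordTranslator;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Values;

// Hypothetical translator that pulls the "schema.name" header (set by NiFi's
// "Attributes to Send as Headers" property) out of each Kafka record and
// emits it alongside the raw Avro payload.
public class SchemaNameTranslator implements RecordTranslator<String, byte[]> {

    @Override
    public List<Object> apply(ConsumerRecord<String, byte[]> record) {
        String schemaName = null;
        Header header = record.headers().lastHeader("schema.name");
        if (header != null) {
            schemaName = new String(header.value(), StandardCharsets.UTF_8);
        }
        // A downstream bolt could use schemaName to look up the schema in the
        // Schema Registry and deserialize the Avro bytes in record.value().
        return new Values(record.key(), record.value(), schemaName);
    }

    @Override
    public Fields getFieldsFor(String stream) {
        return new Fields("key", "value", "schema.name");
    }
}
```

The idea would be to wire this translator into the spout configuration (e.g. via `KafkaSpoutConfig.Builder` with a `ByteArrayDeserializer` for the value), but I'm not sure whether the headers produced by PublishKafkaRecord_0_11 survive end-to-end, which is essentially my question.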