I am reading a Kafka topic containing Avro records that use the Confluent Schema Registry. To write them out with Snappy compression, I am:
Using ConvertRecord with a reader that deserializes the Avro via the Confluent Schema Registry,
and a writer that embeds the schema (Embed Avro Schema / Inherit Record Schema), compresses with Snappy, and writes the records to the flow file.
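For reference, this is roughly how I have the ConvertRecord services configured (property names as they appear in the NiFi UI; exact values abbreviated, so treat this as a sketch of my setup rather than an exact export):

```
# AvroReader controller service
Schema Access Strategy : Confluent Content-Encoded Schema Reference
Schema Registry        : ConfluentSchemaRegistry (pointing at our registry URL)

# AvroRecordSetWriter controller service
Schema Write Strategy  : Embed Avro Schema
Schema Access Strategy : Inherit Record Schema
Compression Format     : SNAPPY
```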
Then I created a MergeRecord processor to read the embedded Avro schema, bin enough records to reach roughly 128 MB, and send the merged flow file to PutHDFS.
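In case it helps, these are the MergeRecord properties I believe are relevant (names from the NiFi MergeRecord documentation; the values shown are what I think I set, so please correct me if any of them would explain the stall):

```
# MergeRecord processor
Merge Strategy            : Bin-Packing Algorithm
Minimum Bin Size          : 128 MB
Maximum Bin Size          : 128 MB
Minimum Number of Records : 1
Maximum Number of Records : (left at default)
Max Bin Age               : (not set)
```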
When I run the processors, the records convert successfully but get stuck in the success queue, which fills up well past 128 MB until the flow stops completely.
I've checked the app logs and there are no error messages.
Also, if I use List Queue on that success connection, it reports no flow files, even though the queue itself shows 10,000 flow files.
I've also tried skipping ConvertRecord entirely and writing the embedded schema with the same record writer directly from ConsumeKafkaRecord_1_0, and got the same result.
Any help would be greatly appreciated.