Created 02-09-2016 06:28 PM
I'm seeing "kafka.common.MessageSizeTooLargeException" in my GetKafka processor. Based on some searching, I've found that by adding "fetch.message.max.bytes=10485760" to my properties file, I can alleviate the issue when using the Kafka console consumer. I'm wondering if there is anywhere I can add this property in HDF to allow for large messages to be consumed.
Created 02-12-2016 02:03 AM
In the Get/PutKafka processors you should be able to add a dynamic property called 'fetch.message.max.bytes' and set the value you need. The processor should allow you to add dynamic properties which map to Kafka consumer properties and it will pass them to the consumer/producer config as needed.
Created 02-09-2016 07:51 PM
@Eric OReilly You shouldn't put fetch.message.max.bytes in config/server.properties; it belongs in your ConsumerConfig (see this doc). If you are using the console consumer, you can pass a --consumer.config consumer.properties flag, where the consumer.properties file contains this config value.
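A minimal sketch of that approach (the topic name and ZooKeeper address are placeholders; the --zookeeper flag matches the old-style console consumer from the Kafka 0.8/0.9 era this thread is about):

```shell
# Write the larger fetch size into a consumer properties file
cat > consumer.properties <<'EOF'
fetch.message.max.bytes=10485760
EOF

# Then pass the file to the console consumer via --consumer.config
# (commented out here; adjust the topic and ZooKeeper address for your cluster):
# kafka-console-consumer.sh --zookeeper localhost:2181 \
#   --topic my-topic \
#   --consumer.config consumer.properties

# Confirm the property is in place
grep fetch.message.max.bytes consumer.properties
```

Note that fetch.message.max.bytes must be at least as large as the broker's message.max.bytes, or the consumer will still be unable to fetch oversized messages.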