
PublishKafka is failing with TokenTooLargeException error

Hi, I am pulling files from HDFS with the GetHDFS processor and publishing them with the PublishKafka processor (Kafka 0.9.x Producer), but I am getting the error below. Can someone tell me what I need to do to rectify it?

2017-12-01 13:21:19,774 ERROR [Timer-Driven Process Thread-8] o.a.n.p.kafka.pubsub.PublishKafka PublishKafka[id=0d4a2dd1-0160-1000-d585-dc9175f6b24d] Failed to send all message for StandardFlowFileRecord[uuid=90ca50f8-5e6a-4ecf-afc9-c553f27ad4af,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1512156079218-30, container=default, section=30], offset=0, length=5011469],offset=0,name=sample_b.csv,size=5011469] to Kafka; routing to failure due to A message in the stream exceeds the maximum allowed message size of 1048576 bytes.: {} A message in the stream exceeds the maximum allowed message size of 1048576 bytes.

Here are the configuration settings for the processor:



@Pallavi Ab

There is a message larger than 1 MB, which is the default maximum size allowed. You either need to increase that limit to allow your larger messages, or reduce the size of your messages to less than 1 MB.
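On the producer side, the relevant limit for a Kafka 0.9.x producer is max.request.size; in the NiFi PublishKafka processor this corresponds to the "Max Request Size" property (default 1 MB). A minimal sketch, with the 10 MB value as an example only, not a recommendation:

```properties
# Kafka 0.9.x producer configuration (sketch only; pick a limit that fits your data)
# In NiFi, set this via the PublishKafka processor's "Max Request Size" property.
max.request.size=10485760
```

Whatever limit you choose here, the broker must also be configured to accept messages at least that large, or the publish will still fail.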

@Pallavi Ab

The default maximum message size for a Kafka broker is 1 MB. Before you can publish a message larger than that, you will have to change the configuration of the Kafka broker you are writing to with NiFi.

That must be done outside of NiFi. Here is a link to the Apache Kafka documentation: Kafka documentation

Make sure you are looking at the version of the broker that matches your environment.
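For reference, a minimal sketch of the broker-side changes in server.properties (the values are examples only; check the documentation for your broker version):

```properties
# Broker configuration (server.properties), sketch only
# Maximum message size the broker will accept (example: 10 MB)
message.max.bytes=10485760
# Followers fetch from the leader, so this must be at least as large
# or replication of the bigger messages will fail
replica.fetch.max.bytes=10485760
```

Consumers of the topic will also need their fetch size (fetch.message.max.bytes in the 0.9-era consumer) raised to match, or they will not be able to read the larger messages.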


Thank you for the reply.

I am trying to read around a GB of data and load it into a topic. These files contain a new record on each line. How high can I go with the message size? Or should I change something else in my configuration to make it work?

Here is how the current configuration looks:

