NiFi 1.3 - PublishKafka_0_10 - A message in the stream exceeds the maximum allowed message size of 1048576 bytes
Labels: Apache NiFi
Created 11-17-2017 05:01 PM
2017-11-17 11:07:47,966 ERROR [Timer-Driven Process Thread-4] o.a.n.p.kafka.pubsub.PublishKafka_0_10 PublishKafka_0_10[id=e6d932d9-97ae-1647-aa8f-86d07791ce25] Failed to send all message for StandardFlowFileRecord[uuid=fa2399e5-bea5-4113-b58b-6cdef228733c,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1510934860019-132, container=default, section=132], offset=0, length=2160613],offset=0,name=12337127439954063,size=2160613] to Kafka; routing to failure due to org.apache.nifi.stream.io.exception.TokenTooLargeException: A message in the stream exceeds the maximum allowed message size of 1048576 bytes.: {}
org.apache.nifi.stream.io.exception.TokenTooLargeException: A message in the stream exceeds the maximum allowed message size of 1048576 bytes.
at org.apache.nifi.stream.io.util.AbstractDemarcator.extractDataToken(AbstractDemarcator.java:157)
at org.apache.nifi.stream.io.util.StreamDemarcator.nextToken(StreamDemarcator.java:129)
at org.apache.nifi.processors.kafka.pubsub.PublisherLease.publish(PublisherLease.java:78)
at org.apache.nifi.processors.kafka.pubsub.PublishKafka_0_10$1.process(PublishKafka_0_10.java:334)
at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2136)
at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2106)
at org.apache.nifi.processors.kafka.pubsub.PublishKafka_0_10.onTrigger(PublishKafka_0_10.java:330)
at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1120)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)

Thanks and Regards
Created 11-21-2017 04:48 AM
Verify that message.max.bytes on the broker (or max.message.bytes at the topic level) is set high enough to accommodate your largest messages. To let consumers read from this topic, increase fetch.message.max.bytes as well.
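For reference, a minimal sketch of where these settings live. The topic name (my-topic), the 5 MB size, and the ZooKeeper address are placeholders, and the kafka-configs.sh invocation assumes a Kafka 0.10.x-era installation:

```
# server.properties on each broker (requires a broker restart):
message.max.bytes=5242880
# Brokers must also be able to replicate the larger messages:
replica.fetch.max.bytes=5242880

# Alternatively, raise the limit for a single topic (no restart needed):
bin/kafka-configs.sh --zookeeper localhost:2181 --alter \
  --entity-type topics --entity-name my-topic \
  --add-config max.message.bytes=5242880

# Old-style consumer property, so consumers can fetch the larger messages:
fetch.message.max.bytes=5242880
```

Note also that the same 1048576-byte default appears in NiFi itself: the PublishKafka_0_10 processor exposes a Max Request Size property (default 1 MB), which is where the TokenTooLargeException in the stack trace above is enforced, so you will likely need to raise that in the processor configuration as well.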
Created 11-21-2017 06:24 PM
Thanks, this solution worked.
Created 05-01-2020 06:29 AM
Hi, where did you find the property he mentioned so I can modify it? I'm facing the same issue. Thanks in advance.
