
Caused by: java.io.IOException: All Partitions have been blacklisted due to failures when attempting to update.

Expert Contributor

Hi All,

I have a pfSense firewall that sends data at a rate of 10,000 events per second. When I switched it over to production, I was not able to match the tuning between the ListenUDP and MergeContent processors.

I got this error

"Caused by: java.io.IOException: All Partitions have been blacklisted due to failures when attempting to update. If the Write-Ahead Log is able to perform a checkpoint, this issue may resolve itself. Otherwise, manual intervention will be required."

Thanks

Dheeru

1 ACCEPTED SOLUTION

Super Mentor
@dhieru singh

This issue is likely being caused by https://issues.apache.org/jira/browse/NIFI-3389.

The above bug was addressed starting in Apache NiFi 1.2.0 and HDF 2.1.2.

The bug occurs when a NiFi attribute being written to the FlowFile repository is larger than 64 KB.

This can usually be traced back to a specific dataflow the end user has designed that uses a NiFi processor capable of extracting actual content into a FlowFile attribute, such as ExtractText or EvaluateJsonPath.

Users must upgrade to a newer version of NiFi that includes this fix, or redesign their dataflow so they are not creating attributes larger than 64 KB.
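To illustrate the redesign advice, here is a minimal sketch of the kind of guard one could apply before writing extracted content into attributes. This is plain Python, not the NiFi API; the function name and the idea of clamping values are illustrative assumptions, with only the 64 KB figure taken from the bug report above.

```python
# Hypothetical guard illustrating the 64 KB FlowFile-attribute limit
# described in NIFI-3389. Plain Python, not the NiFi API.

MAX_ATTR_BYTES = 64 * 1024  # per-attribute limit assumed from the bug report


def clamp_attributes(attributes):
    """Return a copy of an attribute map with oversized values truncated.

    Values are measured in UTF-8 bytes; truncation is done on the
    encoded bytes, and any partial trailing character is dropped
    when decoding back to a string.
    """
    safe = {}
    for name, value in attributes.items():
        encoded = value.encode("utf-8")
        if len(encoded) > MAX_ATTR_BYTES:
            value = encoded[:MAX_ATTR_BYTES].decode("utf-8", errors="ignore")
        safe[name] = value
    return safe
```

In a real flow the cleaner fix is usually to keep large payloads in the FlowFile content and extract only small fields (for example, by limiting capture-group length in ExtractText) rather than truncating after the fact.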

Thank you,

Matt


3 REPLIES


New Contributor

I am running into an error identical to this one on NiFi 1.11.4. Is it possible that some form of this bug has persisted, or is the cause likely to be different?

Community Manager

@jumble as this is an older post, you would have a better chance of receiving a resolution by starting a new thread. This will also be an opportunity to provide details specific to your environment that could aid others in assisting you with a more accurate answer to your question. You can link this thread as a reference in your new post.



Regards,

Vidya Sargur,
Community Manager

