Support Questions

Find answers, ask questions, and share your expertise

Are there any problems with NiFi's JVM? Is there any failover / best practice?

Expert Contributor

Hi All,

Thanks a lot to this community.

I am working on building data flows and have 40 to 50 flows to configure. Some of them run at around 15,000 events per second.

If some data flows are already configured and then a very high events-per-second source is added, or a sudden event burst occurs, and the JVM heap or the disk fills up, everything crashes.

Does that mean we will have to proactively watch for space issues such as JVM heap size, the content repository, the provenance repository, etc.?

Thanks

Dheiru

1 ACCEPTED SOLUTION


Hi @dhieru singh

You need to do two things:

  • First, you need good capacity planning to evaluate the infrastructure required to handle your data flows. Consider the worst-case scenario so you have room for growth and the capacity to absorb bursts. There are several resources out there that can help you:

https://community.hortonworks.com/articles/135337/nifi-sizing-guide-deployment-best-practices.html

https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.0.0/bk_command-line-installation/content/hdf_is...
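As a concrete starting point, heap limits live in `conf/bootstrap.conf` and repository disk usage can be bounded in `conf/nifi.properties`. The property names below are standard NiFi settings; the values are illustrative assumptions only and should come out of your own capacity plan:

```properties
# conf/bootstrap.conf -- bound the JVM heap (example sizes, not recommendations)
java.arg.2=-Xms4g
java.arg.3=-Xmx4g

# conf/nifi.properties -- keep the content repository archive from filling the disk
nifi.content.repository.archive.max.retention.period=12 hours
nifi.content.repository.archive.max.usage.percentage=50%

# bound provenance repository growth
nifi.provenance.repository.max.storage.time=24 hours
nifi.provenance.repository.max.storage.size=1 GB
```

On top of sizing, NiFi's per-connection back pressure (object count and data size thresholds, configurable in the UI on each connection) is what keeps a sudden burst from cascading through the whole flow.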


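To make the "proactively watch" part concrete: NiFi exposes heap and repository usage through its REST API at `/nifi-api/system-diagnostics`. Below is a minimal sketch of polling it; the endpoint path and the `heapUtilization` field follow the NiFi REST API, but the base URL, threshold, and helper names are assumptions, and field names should be verified against your NiFi version:

```python
import json
import urllib.request


def heap_utilization(diagnostics: dict) -> float:
    """Extract heap utilization (percent) from a system-diagnostics payload."""
    snapshot = diagnostics["systemDiagnostics"]["aggregateSnapshot"]
    # heapUtilization is reported as a string such as "65.0%"
    return float(snapshot["heapUtilization"].rstrip("%"))


def heap_above_threshold(base_url: str, threshold: float = 80.0) -> bool:
    """Poll a live NiFi instance and report whether heap usage exceeds threshold."""
    with urllib.request.urlopen(f"{base_url}/nifi-api/system-diagnostics") as resp:
        return heap_utilization(json.load(resp)) > threshold


# Abridged example of the payload shape returned by the endpoint:
sample = {"systemDiagnostics": {"aggregateSnapshot": {"heapUtilization": "65.0%"}}}
print(heap_utilization(sample))  # -> 65.0
```

In practice you would run such a check from a scheduler or monitoring system and alert before the heap or a repository disk actually fills.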