Created 10-23-2017 09:14 PM
Hi All,
Thanks a lot to this community.
I am working on building data flows, and I have 40 to 50 data flows to configure; some of them handle around 15,000 events per second.
If some data flows are already configured and I then add a very high events-per-second data source, or a sudden event burst occurs, the JVM heap and the repositories can fill up and everything crashes because of disk space.
Does this mean we have to proactively watch for space issues such as JVM heap size, the content repository, the provenance repository, etc.?
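For what it's worth, the repositories mentioned above can be bounded up front rather than only watched: the content and provenance repositories are capped in nifi.properties, and heap is set in bootstrap.conf. A sketch with commonly used properties (the values are illustrative examples, not sizing recommendations):

```properties
# nifi.properties — cap repository growth (example values)
# Stop retaining archived content claims once the partition is 50% full
nifi.content.repository.archive.max.usage.percentage=50%
nifi.content.repository.archive.max.retention.period=12 hours

# Bound the provenance repository by both size and age
nifi.provenance.repository.max.storage.size=10 GB
nifi.provenance.repository.max.storage.time=24 hours

# bootstrap.conf — JVM heap (min/max), adjust to your host
java.arg.2=-Xms8g
java.arg.3=-Xmx8g
```

Putting each repository on its own partition also keeps a burst in one repo from starving the others.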
Thanks
Dheiru
Created 10-24-2017 04:44 AM
You need to do two things:
https://community.hortonworks.com/articles/135337/nifi-sizing-guide-deployment-best-practices.html
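In addition to the sizing guide above, a small external check can watch the repository partitions before they fill. A minimal sketch (the repository path and 80% threshold are illustrative assumptions, not NiFi defaults):

```shell
#!/bin/sh
# Report how full the filesystem backing a NiFi repository is.
check_repo_usage() {
    # df -P gives stable POSIX output; column 5 is "Use%" on the data line
    df -P "$1" | awk 'NR==2 { gsub("%", "", $5); print $5 }'
}

# Example: warn when the content repository partition passes 80% used.
# NIFI_CONTENT_REPO and the default path are illustrative.
REPO="${NIFI_CONTENT_REPO:-/var/lib/nifi/content_repository}"
THRESHOLD=80
if [ -d "$REPO" ]; then
    used=$(check_repo_usage "$REPO")
    if [ "$used" -ge "$THRESHOLD" ]; then
        echo "WARN: $REPO filesystem at ${used}% used"
    fi
fi
```

Run it from cron (or feed the number into your monitoring system) so a sudden burst shows up before the disk is actually full.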