On my Cloudera Hadoop cluster, about 1 TB of data is loaded every day, and the cluster currently holds roughly 17,811,802 data blocks. Right now the performance is good.
I want to calculate how many data blocks my cluster can support without running into performance issues.
In one blog I read that the NameNode needs about 1 GB of heap for every 1,000,000 data blocks. Is that true?
We have an 18-node cluster, and every node has 380 GB of RAM.
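If that rule of thumb holds (roughly 1 GB of NameNode heap per million blocks, which is a common Cloudera sizing guideline, though actual usage also depends on file and directory counts), the arithmetic is straightforward. A minimal sketch, where the 100 GB heap figure is purely a hypothetical allocation for illustration:

```python
# Rough NameNode heap sizing, assuming ~1 GB of heap per 1,000,000 blocks.
# This is a rule-of-thumb estimate only; real heap usage depends on the
# number of files, directories, and replicas tracked by the NameNode.

def heap_gb_needed(num_blocks, gb_per_million_blocks=1.0):
    """Estimated NameNode heap (GB) for a given block count."""
    return num_blocks / 1_000_000 * gb_per_million_blocks

def max_blocks_supported(heap_gb, gb_per_million_blocks=1.0):
    """Estimated block count a given NameNode heap (GB) can hold."""
    return int(heap_gb / gb_per_million_blocks * 1_000_000)

current_blocks = 17_811_802  # block count from the question
print(f"Heap needed now: ~{heap_gb_needed(current_blocks):.1f} GB")
# -> Heap needed now: ~17.8 GB

# Hypothetically, if 100 GB of a node's 380 GB RAM were dedicated to the
# NameNode heap (the OS and other services also need memory):
print(f"Blocks supported by a 100 GB heap: ~{max_blocks_supported(100):,}")
# -> Blocks supported by a 100 GB heap: ~100,000,000
```

So by this estimate the current ~17.8 million blocks need only about 18 GB of NameNode heap, which is well within a 380 GB node; long before RAM becomes the limit, other factors (block report processing, GC pauses on a very large heap, small-file count) usually matter more.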