Created on 01-03-2018 05:44 AM - edited 09-16-2022 08:26 AM
Is there an upper limit on storage capacity per node? Can DataNodes scale beyond 100TB/node?
Created 03-20-2018 10:40 AM
Hi,
For a DataNode with 100TB of storage, how much RAM is required?
Created 03-20-2018 10:52 AM
That's mostly a function of the number of blocks stored on the DataNode. A common rule of thumb is roughly 1 GB of DataNode heap for every one million blocks stored on that node.
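To put rough numbers on that rule of thumb, here is a minimal sketch. The 100TB capacity and 128MB average block size are illustrative assumptions (128MB is the default dfs.blocksize, but real-world averages are often smaller, which raises the block count):

```python
# Rough DataNode heap estimate from the "~1 GB heap per 1M blocks" rule of thumb.
# Capacity and average block size below are assumptions for illustration.

TB = 1024**4
MB = 1024**2

capacity_bytes = 100 * TB        # assumed raw DataNode capacity
avg_block_size = 128 * MB        # assumed average; default dfs.blocksize

blocks = capacity_bytes / avg_block_size
heap_gb = blocks / 1_000_000     # rule of thumb: ~1 GB heap per million blocks

print(f"~{blocks:,.0f} blocks -> ~{heap_gb:.1f} GB DataNode heap")
# ~819,200 blocks -> ~0.8 GB DataNode heap
```

Note that if the average block is smaller (say 32MB instead of 128MB), the block count, and therefore the heap estimate, quadruples for the same raw capacity.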
Created 03-28-2019 01:25 PM
Can you provide more information on the block-report load issue (for low-latency operations) when a DataNode has 100TB+ of storage? We need archive nodes for HDFS storage purposes only; no YARN/Spark will run on them. They will only store data moved there by a storage migration policy. The nodes' network and storage I/O bandwidth is considered sufficient to handle the larger storage size.
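To make the block-report concern concrete, here is a rough sketch of how full block-report size grows with density. The ~24 bytes per replica figure (block ID, length, and generation stamp as three longs, as in older buffer-based reports) is an assumption for illustration; the actual wire encoding varies by Hadoop version:

```python
# Rough feel for full block-report size on dense DataNodes.
# The 24 bytes/replica figure is an illustrative assumption.

TB, MB = 1024**4, 1024**2

for capacity_tb, avg_block_mb in [(100, 128), (200, 128), (100, 32)]:
    blocks = capacity_tb * TB / (avg_block_mb * MB)
    report_mb = blocks * 24 / MB
    print(f"{capacity_tb} TB @ {avg_block_mb} MB/block: "
          f"~{blocks / 1e6:.1f}M blocks, ~{report_mb:.0f} MB per full block report")
```

The takeaway is that report size and NameNode processing time scale linearly with block count, so a dense archive node's reports cost more on the NameNode side even when the DataNode itself does no compute work.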