HDFS capacity
Labels: Apache Hadoop
Created 03-17-2016 07:27 AM
Our HDFS cluster size is 16 TB. We have 3 data nodes and 1 name node.
1. How do we find out the storage size allotted to the 3 data nodes?
2. Does the name node use only RAM? How do we find its capacity as well?
Created 03-17-2016 09:51 AM
You can check the DataNodes' configured HDFS capacity by:
1. Opening the NameNode UI at http://namenodeIp:50070/dfshealth.html#tab-datanode
2. Running hadoop dfsadmin -report from the command line (on newer releases, the equivalent is hdfs dfsadmin -report)
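For example, from the command line (a rough sketch; namenodeIp is a placeholder and the exact output fields vary by Hadoop version):

```bash
# Cluster-wide and per-DataNode capacity report
hdfs dfsadmin -report

# Keep only the capacity-related lines (field names may differ slightly by version)
hdfs dfsadmin -report | grep -E 'Configured Capacity|DFS Used|DFS Remaining'

# Filesystem-level view of total size, used and available space
hdfs dfs -df -h /
```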
Created on 03-17-2016 09:55 AM - edited 08-19-2019 02:17 AM
The NameNode's RAM (heap) capacity can be found in the NameNode UI as well.
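If you prefer the command line, one sketch is to query the NameNode's built-in JMX servlet, which is enabled by default on the web UI port (namenodeIp and the port are placeholders for your cluster):

```bash
# Query the NameNode JMX servlet for JVM memory usage; the JSON response
# includes a HeapMemoryUsage block with committed, used and max heap in bytes
curl -s 'http://namenodeIp:50070/jmx?qry=java.lang:type=Memory'
```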
Created 03-18-2016 09:50 AM
Thanks very much. Does the name node use only RAM? Is no disk space required, as there is for a data node?
Created 03-22-2016 04:39 PM
@Ram Note that disks are required for the NameNode as well. See this post on sizing the NameNode: https://community.hortonworks.com/questions/1692/any-recommendation-on-how-to-partition-disk-space.h...
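To illustrate: the NameNode persists its metadata (the fsimage and edit logs) to the local directories configured by dfs.namenode.name.dir, so those disks must exist and should be reliable. A quick way to see where your cluster puts them (the du path below is only an example):

```bash
# Show the local directories where the NameNode writes its fsimage and edit logs
hdfs getconf -confKey dfs.namenode.name.dir

# Check how much local disk one of those directories uses (example path)
du -sh /hadoop/hdfs/namenode
```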
