
How to check the total storage capacity of the cluster using an edge node?

Rising Star

What is the storage limit of the NameNode (NN), and what is the total storage capacity of the cluster?

1 ACCEPTED SOLUTION

Master Guru

Run sudo -u hdfs hdfs dfsadmin -report

This will give you a full report on HDFS.


5 REPLIES

Master Guru

Run sudo -u hdfs hdfs dfsadmin -report

This will give you a full report on HDFS.
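
If you only need the cluster-wide totals, you can trim the report right on the edge node. A minimal sketch (the awk filter simply stops printing at the dashed separator that precedes the per-datanode list; adjust as needed):

 # print only the cluster-wide summary section of the report
 sudo -u hdfs hdfs dfsadmin -report | awk '/^----/ {exit} {print}'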

Master Guru

Report example

 sudo -u hdfs hdfs dfsadmin -report
Configured Capacity: 7504658432 (6.99 GB)
Present Capacity: 527142912 (502.72 MB)
DFS Remaining: 36921344 (35.21 MB)
DFS Used: 490221568 (467.51 MB)
DFS Used%: 93.00%
Under replicated blocks: 128
Blocks with corrupt replicas: 0
Missing blocks: 0
Missing blocks (with replication factor 1): 0

-------------------------------------------------
Live datanodes (1):

Name: 192.168.114.48:50010 (host-192-168-114-48.td.local)
Hostname: host-192-168-114-48.td.local
Decommission Status : Normal
Configured Capacity: 7504658432 (6.99 GB)
DFS Used: 490221568 (467.51 MB)
Non DFS Used: 6977515520 (6.50 GB)
DFS Remaining: 36921344 (35.21 MB)
DFS Used%: 6.53%
DFS Remaining%: 0.49%
Configured Cache Capacity: 0 (0 B)
Cache Used: 0 (0 B)
Cache Remaining: 0 (0 B)
Cache Used%: 100.00%
Cache Remaining%: 0.00%
Xceivers: 2

Master Guru

And one more

$ hadoop fs -df -h

Filesystem                                 Size     Used  Available  Use%
hdfs://host-192-168-114-48.td.local:8020  7.0 G  467.5 M     18.3 M    7%
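
If you want to use those numbers in a script, the same command without -h prints raw byte counts, which are easier to parse. A rough sketch, assuming the HDFS client is configured on the edge node:

 # Size / Used / Available are reported in bytes without the -h flag
 hadoop fs -df /

The columns are the same (Filesystem, Size, Used, Available, Use%), just in bytes instead of human-readable units.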

Master Guru

Another option: Ambari has most of the metrics you are looking for. From your edge node, simply use curl to call the Ambari API and fetch the stats you need. Here is more on the Ambari API:

https://cwiki.apache.org/confluence/display/AMBARI/Ambari+Metrics+API+specification

For example

http://<ambari-host>:8080/api/v1/clusters/<cluster-name>/hosts/<namenode-host>/host_components/NAMENODE?fields=metrics/dfs/FSNamesystem/Capaci...

or you can view all metrics

http://<ambari-host>:8080/api/v1/clusters/<cluster-name>/hosts/<namenode-host>/host_components/NAMENODE?fields=metrics
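
As a rough sketch of what that call could look like from the edge node (everything in angle brackets, and the admin:admin credentials, are placeholders to substitute for your cluster):

 # fetch the NameNode FSNamesystem capacity metrics via the Ambari REST API
 curl -s -u admin:admin \
   "http://<ambari-host>:8080/api/v1/clusters/<cluster-name>/hosts/<namenode-host>/host_components/NAMENODE?fields=metrics/dfs/FSNamesystem"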

Expert Contributor

You can check this with hadoop dfsadmin -report, as below. You can also run it without the root user.

:~> hadoop dfsadmin -report

Use of this script to execute hdfs command is deprecated. Instead use the hdfs command for it.

Configured Capacity: 95930 (87.50 TB)
Present Capacity: 95869819 (87.93 TB)
DFS Remaining: 37094235 (33.37 TB)
DFS Used: 587755833 (53.56 TB)
DFS Used%: 61.31%
Under replicated blocks: 0
Blocks with corrupt replicas: 5
Missing blocks: 0
-------------------------------------------------
report: Access denied for user "username". Superuser privilege is required
:~>
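
The "Access denied" at the end just means the per-datanode details require HDFS superuser privilege; the cluster-wide summary above it is still printed. If you have sudo on the edge node, re-running the report as the hdfs user (as in the accepted solution) returns the complete output:

 sudo -u hdfs hdfs dfsadmin -report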