@sagarshimpi For example, if I have 5 DataNodes (DNs) and each DN has 10 data directories on individual disk mount points (like /Disk1, /Disk2, ..., /Disk10),
how do I list which files' blocks are stored on DN1/Disk2?
Hi @Ba, you can probably fetch this information from the HDFS CLI rather than going to each individual DataNode disk.
You can run:

hdfs fsck / -files -blocks -locations

This will list every file's blocks along with the DataNodes on which each block replica is stored. To segregate the output per DataNode, you can apply a "grep" or "awk" filter.
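To sketch the filtering step: the DataNode address, block IDs, and file names below are made-up sample data standing in for live fsck output (the line format mirrors what fsck typically prints), so the grep idea can be shown on its own.

```shell
# On a real cluster you would pipe the live output, e.g.:
#   hdfs fsck / -files -blocks -locations | grep '10.0.0.11:9866'
# Here a canned sample stands in for the fsck output; all values
# (paths, IPs, block IDs) are invented for illustration.
fsck_output='/data/file1.txt 268435456 bytes, 2 block(s):
0. BP-111:blk_1073741825_1001 len=134217728 [DatanodeInfoWithStorage[10.0.0.11:9866,DS-aaa,DISK], DatanodeInfoWithStorage[10.0.0.12:9866,DS-bbb,DISK]]
1. BP-111:blk_1073741826_1002 len=134217728 [DatanodeInfoWithStorage[10.0.0.12:9866,DS-bbb,DISK], DatanodeInfoWithStorage[10.0.0.13:9866,DS-ccc,DISK]]'

# Keep only the block lines that have a replica on DataNode 10.0.0.11:
echo "$fsck_output" | grep '10.0.0.11:9866'
```

This prints only block 0, since block 1 has no replica on 10.0.0.11.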
Hope this is what you were looking for.
My understanding is that the above command gives the DataNodes on which the blocks are stored.
But my question is: can it also give the /Disk1 or /Disk2 level information?
I am not sure if the following approach will solve your issue,
but the following article suggests that we can create and deploy a custom Alert Script to get more detailed information about disk usage,
i.e. which disk is consuming how much space. You may be able to enhance the logic a bit more to meet your exact requirement.
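On the disk-level question specifically: fsck itself does not print the mount point, but once you have a block ID from fsck you can log in to that DataNode and search its data directories for the block file. A minimal sketch, using a mock directory layout (the paths and block IDs are invented for illustration; on a real DN the data directories are whatever dfs.datanode.data.dir points at):

```shell
# Mock up two DataNode data directories (stand-ins for /Disk1, /Disk2);
# on a real DN these come from the dfs.datanode.data.dir setting.
dn_root=$(mktemp -d)
mkdir -p "$dn_root/Disk1/current" "$dn_root/Disk2/current"
touch "$dn_root/Disk1/current/blk_1073741825"
touch "$dn_root/Disk2/current/blk_1073741826"

# Given a block ID reported by fsck, find which disk holds it:
find "$dn_root" -name 'blk_1073741826*' -type f
```

The printed path shows the disk (here Disk2) that stores the block file, which you can then roll up per mount point with grep/awk.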