
Worker node '/' directory is full Alerts

Re: Worker node '/' directory is full Alerts

Explorer

@sagarshimpi OK, let me work on this and update you with the result.

Re: Worker node '/' directory is full Alerts

Expert Contributor

Hi @Ba, did you test this? Do you have any update?

 

 

Re: Worker node '/' directory is full Alerts

Explorer

@sagarshimpi I'll update you once I do.

Thanks

Balu

Re: Worker node '/' directory is full Alerts

Explorer

Hi @sagarshimpi

 

Is there any way to list the HDFS files stored under a given data directory?

Re: Worker node '/' directory is full Alerts

Expert Contributor

@Ba I didn't get you. Can you please elaborate?

Re: Worker node '/' directory is full Alerts

Explorer

@sagarshimpi For example, say I have 5 DataNodes (DNs) and each DN has 10 data directories on individual disk mount points (like /Disk1, /Disk2, ..., /Disk10).

 

How can I list which files' blocks are stored in DN1/Disk2?

 

Thanks

Balu

Re: Worker node '/' directory is full Alerts

Expert Contributor

Hi @Ba, you can probably fetch this information from the HDFS CLI rather than going to each individual datanode disk.

 

You can run:

hdfs fsck / -files -blocks -locations

This will give you the block information for every file, including the locations (datanodes) on which the blocks are stored.

To segregate the output per datanode, you can apply a "grep" or "awk" filter.
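A minimal sketch of that filtering idea in Python, in case grep/awk gets unwieldy. The sample fsck output below is illustrative (made-up paths, block IDs, and IPs), not real cluster output; in practice you would feed in the actual output of `hdfs fsck / -files -blocks -locations`:

```python
import re
from collections import defaultdict

# Illustrative sample of `hdfs fsck / -files -blocks -locations` output.
SAMPLE_FSCK_OUTPUT = """\
/user/balu/file1.txt 1048576 bytes, 1 block(s):
0. BP-1:blk_1073741825_1001 len=1048576 [DatanodeInfoWithStorage[10.0.0.1:50010,DS-aaa,DISK], DatanodeInfoWithStorage[10.0.0.2:50010,DS-bbb,DISK]]
/user/balu/file2.txt 2097152 bytes, 1 block(s):
0. BP-1:blk_1073741826_1002 len=2097152 [DatanodeInfoWithStorage[10.0.0.2:50010,DS-bbb,DISK]]
"""

def blocks_by_datanode(fsck_output):
    """Group block IDs by the datanode (ip:port) that holds a replica."""
    result = defaultdict(list)
    for line in fsck_output.splitlines():
        blk = re.search(r"(blk_\d+)", line)
        if not blk:
            continue  # skip file-name and summary lines
        for dn in re.findall(r"DatanodeInfoWithStorage\[([\d.]+:\d+)", line):
            result[dn].append(blk.group(1))
    return result

by_dn = blocks_by_datanode(SAMPLE_FSCK_OUTPUT)
for dn, blocks in sorted(by_dn.items()):
    print(dn, blocks)
```

Note this only resolves blocks down to the datanode, which matches what fsck reports; it does not tell you which disk within the datanode holds each replica.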

 

Hope this is what you were looking for.

Re: Worker node '/' directory is full Alerts

Explorer

My understanding is that the above command gives the locations (datanodes) on which the blocks are stored.

 

But my question is: can it also give /Disk1 or /Disk2 level information?

 

Example: block1 -> DN2 -> Disk2

 

Thanks

Balu


Re: Worker node '/' directory is full Alerts

Super Mentor

@Ba 

I am not sure if the following approach will solve your issue.

 

But the article linked below suggests that we can create and deploy a custom alert script to get more detailed information about disk usage.

 

It reports which disk is consuming how much space; maybe you can enhance the logic a bit more to meet your exact requirement.

https://community.cloudera.com/t5/Community-Articles/How-to-create-and-register-custom-ambari-alerts...
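For reference, a sketch of the skeleton such a script alert typically follows. The mount list and thresholds below are illustrative assumptions (you would list your actual data-disk mount points), and the registration steps are in the linked article:

```python
#!/usr/bin/env python
# Sketch of a custom Ambari alert script reporting per-mount disk usage.
# MOUNTS, WARN_PCT and CRIT_PCT are illustrative assumptions.
import os

MOUNTS = ["/"]      # replace with the datanode data-disk mount points
WARN_PCT = 80
CRIT_PCT = 90

def _usage_pct(mount):
    """Percentage of the filesystem at `mount` that is in use."""
    st = os.statvfs(mount)
    total = st.f_blocks * st.f_frsize
    free = st.f_bavail * st.f_frsize
    return 100.0 * (total - free) / total if total else 0.0

def execute(configurations=None, parameters=None, host_name=None):
    """Entry point Ambari invokes for a SCRIPT-type alert.

    Returns a (result_code, [label]) tuple, e.g. ('OK', ['/ is 42.0% full']).
    """
    worst = "OK"
    labels = []
    for mount in MOUNTS:
        pct = _usage_pct(mount)
        labels.append("%s is %.1f%% full" % (mount, pct))
        if pct >= CRIT_PCT:
            worst = "CRITICAL"
        elif pct >= WARN_PCT and worst == "OK":
            worst = "WARNING"
    return (worst, [", ".join(labels)])

if __name__ == "__main__":
    print(execute())
```

With one label per disk, a filter on the alert text then tells you which specific mount point is filling up, rather than just the aggregate "/" usage.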
