
HDFS storage capacity usage


After executing spark-submit several times, I started getting cluster capacity usage alerts. Please see the attached screenshot.


I assume the cause lies in the logs. How can I clean the logs up and free disk space?



Re: HDFS storage capacity usage

@Liana Napalkova You should open a shell console on one of the hosts (for example, eureambarislave1) and check the disk usage by running:

# run these commands as the root user, or prefix them with sudo
df -h
du -d 1 -h /

This will show which mount point is running out of space. Depending on your disk partitions and mount points, the space may be consumed by the HDFS data directories, /tmp, or the logs directory, as you suspected.
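If the logs turn out to be the culprit, the sketch below shows one way to inspect and trim them. Note this is a generic illustration, not a Cloudera/HDP-specific procedure: the LOG_DIR path is a placeholder you must point at your cluster's actual log directory, and you should preview any find before adding -delete.

```shell
#!/bin/sh
# Hypothetical example: LOG_DIR is a placeholder; set it to your
# cluster's real log directory (often somewhere under /var/log).
LOG_DIR="${LOG_DIR:-/var/log/hadoop}"

# List the largest files/directories under the log directory,
# biggest first, so you can see what is worth cleaning.
du -ah "$LOG_DIR" 2>/dev/null | sort -rh | head -20

# Remove rotated log files older than 7 days.
# Run with -print only (no -delete) first to preview what would go.
find "$LOG_DIR" -name '*.log.*' -mtime +7 -print -delete 2>/dev/null

# For a log file a process is still writing to, truncate it in place
# instead of deleting it, so the process keeps a valid file handle:
#   truncate -s 0 "$LOG_DIR/some-active.log"
```

Deleting an open log file does not free space until the writing process is restarted, which is why truncation is usually the safer choice for active logs.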

Note: If you'd like to comment on this post, make sure you tag my name so I receive an email update. Also, if this answer addressed your question, please take a moment to log in and click the "Accept" link on the answer.
