I am facing a critical warning on the CDH Manager interface for a role's log directory:

This role's log directory is on a filesystem with less than 5.0 GiB of its space free. /var/log/hadoop-hdfs (free: 119.0 MiB (0.24%), capacity: 49.1 GiB)

On my system I can see the root partition is full, but I do have space in the home directory. I would like to know which property I need to change to send logs to /home instead of the small root partition. I haven't found any link that fixes this issue, so if you can point me to the right info it would be really helpful.
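From what I've read so far, each role in Cloudera Manager seems to expose its own Log Directory setting under the role's configuration (e.g. a DataNode Log Directory for the HDFS DataNode), which could be pointed at a directory on /home and the role restarted; I'm not certain of the exact property names, so treat that as an assumption. Another stopgap I'm considering is moving the directory and symlinking it back, so anything still writing to the old path follows the link. Below is a demo of that trick on throwaway temp paths only (the real move would use /var/log/hadoop-hdfs and a directory under /home, with the role stopped first):

```shell
# Demo of the move-and-symlink trick on throwaway paths.
# Nothing here touches the real log directory.
src=$(mktemp -d)/hadoop-hdfs      # stands in for /var/log/hadoop-hdfs
dst=$(mktemp -d)/hadoop-hdfs      # stands in for a new dir under /home
mkdir -p "$src" "$dst"
echo "sample log line" > "$src/datanode.log"
cp -a "$src/." "$dst"/            # copy existing logs across
rm -rf "$src"                     # remove the old directory...
ln -s "$dst" "$src"               # ...and symlink it to the new location
cat "$src/datanode.log"           # old path still works via the link
```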
[root@hadoop-vm2 subdir0]# df -h
Filesystem Size Used Avail Use% Mounted on
50G 47G 111M 100% /
tmpfs 15G 8.0K 15G 1% /dev/shm
/dev/sda1 477M 63M 389M 14% /boot
742G 55G 650G 8% /home
cm_processes 15G 5.3M 15G 1% /var/run/cloudera-scm-agent/process
My issue, I think, is the filled root partition. After troubleshooting I found I was earlier using /dfs/dn for HDFS block storage; later I added a non-OS partition under /home (/home/hdfs/dfs/dn) and then started importing hundreds of GB of data. It looks like my old path /dfs/dn had also stored some of the HDFS blocks and filled the root partition. So if I now remove /dfs/dn from the dfs.data.dir configuration and restart the cluster, will HDFS automatically move the data to the only remaining location, /home/hdfs/dfs/dn, or how should I handle that? I guess this would fix my problem for now.
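From what I understand (please correct me if this is wrong), HDFS will not move existing blocks automatically when a directory is dropped from dfs.data.dir: the DataNode simply stops using it, and any blocks stored only there are reported missing until re-replicated from other nodes, or lost outright at replication factor 1. The safer sequence seems to be: stop the DataNode role, copy the contents of /dfs/dn into /home/hdfs/dfs/dn (e.g. `cp -a` or `rsync -a`, preserving the hdfs user's ownership), update the config, restart, and only delete /dfs/dn after `hdfs fsck /` reports no missing blocks. After the copy, the property (dfs.data.dir, named dfs.datanode.data.dir in newer releases) would look something like this in hdfs-site.xml; a sketch only, since on a CM-managed host you would edit the DataNode data directory setting in Cloudera Manager rather than the file directly:

```xml
<!-- Sketch: data dir list with only the /home partition left.
     On a CM-managed cluster, change this via the DataNode role's
     data directory configuration in Cloudera Manager instead. -->
<property>
  <name>dfs.data.dir</name>
  <value>/home/hdfs/dfs/dn</value>
</property>
```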
Do not worry much about the data; whatever is quickest and simplest will be fine.
[root@hadoop-vm2 /]# du -sh ./*
du: cannot access `./proc/20676/task/20676/fd/4': No such file or directory
du: cannot access `./proc/20676/task/20676/fdinfo/4': No such file or directory
du: cannot access `./proc/20676/fd/4': No such file or directory
du: cannot access `./proc/20676/fdinfo/4': No such file or directory
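(The `du` errors above come from scanning /proc, where file descriptors appear and vanish while du runs. A sketch of a scan that stays on the root filesystem only, so /proc, /dev/shm, /home and the other mounts are skipped, and that surfaces the biggest top-level directories:)

```shell
# -x            : stay on one filesystem (skips /proc, /home, tmpfs mounts)
# --max-depth=1 : one summary line per top-level directory
# sort -h       : order by human-readable size, largest last
du -xh --max-depth=1 / 2>/dev/null | sort -h | tail -n 10
```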
[root@hadoop-vm2 /]# ls /dfs/dn/current/