Created 03-14-2016 09:06 AM
Hi,
It looks like I reached the memory limit for my sandbox. However, I can't delete files since safe mode is on. So I tried to turn it off manually with:
hdfs dfsadmin -safemode leave
but it has no effect. From what I understand, the filesystem is still full when I turn safe mode off, so the system immediately goes back into safe mode.
How can i delete some files?
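For reference, the usual way out of this loop is to free HDFS space first and only then leave safe mode. A minimal sketch of that sequence (the path below is a placeholder, not something from this thread):

```shell
# Check how full HDFS is (look at the "DFS Used%" figure):
hdfs dfsadmin -report

# Delete files while bypassing the trash, so the space is freed immediately
# (files moved to .Trash would otherwise still count against the quota):
hdfs dfs -rm -r -skipTrash /path/to/large/dir

# Only then leave safe mode; with free space available it should not
# immediately re-enter it:
hdfs dfsadmin -safemode leave
```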
Created 03-14-2016 09:21 AM
You first need to check which processes are using the most memory. You can find out with the following steps:
1. Run the top command (as root)
2. Press 'm'
3. Check the top 2-3 processes in the '%MEM' column
I would also suggest checking free memory with the 'free -m' command.
Most of the time the memory is taken up by the page cache. You can drop the cache with the following command:
# echo 3 > /proc/sys/vm/drop_caches
To free up disk space, you can delete the Hadoop log files located in /var/log/hadoop/<application-name>
Do let me know if it works.
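The memory check above can also be done non-interactively, which is handy on a headless sandbox. A small sketch reading /proc/meminfo directly (Linux-only, no assumptions about this particular VM):

```shell
# Print total and available memory in MB without an interactive top session:
awk '/^MemTotal|^MemAvailable/ {printf "%s %d MB\n", $1, $2/1024}' /proc/meminfo

# List the top 3 memory consumers, equivalent to sorting top by %MEM:
ps -eo pid,comm,%mem --sort=-%mem | head -n 4
```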
Created 03-14-2016 09:59 AM
Go to the Ambari dashboard, stop the services that you don't need, and turn on maintenance mode for them.
Then, to save time, restart the VM.
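If the Ambari UI itself is not usable, the same stop can be issued over Ambari's REST API. A hedged sketch, assuming the default sandbox hostname, cluster name "Sandbox", admin credentials, and FLUME as an example service; all of these may differ on your VM:

```shell
# Stop a service by setting its desired state to INSTALLED via the Ambari
# REST API. The X-Requested-By header is required by Ambari for PUT requests.
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Stop service to free memory"},
       "Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
  http://sandbox.hortonworks.com:8080/api/v1/clusters/Sandbox/services/FLUME
```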
Created 03-14-2016 10:22 AM
Thank you for your answer. Unfortunately, I can't access the Ambari dashboard. For some reason, the username/password I've been using for a month doesn't work anymore. I thought it was related to this safe mode thing, but apparently it wasn't.