Created 01-20-2017 09:00 PM
Hi,
I am using an HDP 2.4.2 cluster. Since yesterday, the heap usage on a few of my DataNodes has been continuously high, even when no jobs are running. I have seen this article:
https://community.hortonworks.com/articles/74076/datanode-high-heap-size-alert.html
but I'm not sure where exactly to make these changes.
Do I go to the hadoop-env template in Ambari and change HADOOP_DATANODE_OPTS=...?
And once I do, do I need to restart the HDFS service?
Thanks.
Created 01-20-2017 09:08 PM
Yes, you're correct. You can modify the HADOOP_DATANODE_OPTS parameters. Ambari will prompt you if a restart is required; in this case, yes, a restart is needed.
Created 01-20-2017 10:15 PM
@apappu is correct. These JVM options should be added to HADOOP_DATANODE_OPTS in the hadoop-env template.
After making the changes, you should restart all of the DataNodes (there is no need to restart the NameNodes). I recommend restarting the DataNodes two at a time.
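To make the location of the change concrete, here is a sketch of what an edit to the hadoop-env template might look like. The specific flags and heap sizes below are illustrative assumptions only; the actual values should come from the linked article and be sized for your DataNodes.

```shell
# In Ambari: HDFS > Configs > Advanced hadoop-env > hadoop-env template.
# The JVM options are appended to HADOOP_DATANODE_OPTS. The -Xms/-Xmx and
# GC flags shown here are placeholder examples, not recommended values.
export HADOOP_DATANODE_OPTS="-server \
  -Xms2048m -Xmx2048m \
  -XX:+UseConcMarkSweepGC \
  -XX:ParallelGCThreads=4 \
  ${HADOOP_DATANODE_OPTS}"
```

Preserving the existing `${HADOOP_DATANODE_OPTS}` at the end keeps any options Ambari already injects into the line.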
Created 09-27-2017 10:51 AM
Attempts to view that article result in an access denied error.
Created 01-25-2017 03:26 PM
Yes, I applied those parameter settings, and all of the alerts cleared within a few minutes. Thanks, all.