Support Questions
Find answers, ask questions, and share your expertise

Datanode failed to start

Recently I upgraded the Oracle JDK from 1.7 to 1.8u131.


After the upgrade, starting the cluster fails with the error below. The configured value of dfs.datanode.max.locked.memory is 3.36 GB.


java.lang.RuntimeException: Cannot start datanode because the configured max locked memory size (dfs.datanode.max.locked.memory) is greater than zero and native code is not available.
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(
2018-02-07 15:41:56,472 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2018-02-07 15:41:56,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
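For context, this error is raised when dfs.datanode.max.locked.memory is non-zero but the DataNode could not load the libhadoop native library, which the HDFS centralized cache feature depends on for memory locking. If caching is not in use, one possible workaround (a sketch, not an official fix for this cluster) is to set the property to 0 in hdfs-site.xml:

```xml
<!-- hdfs-site.xml: set the DataNode locked-memory limit to 0 so the
     DataNode can start without the libhadoop native library.
     Only appropriate if HDFS centralized caching is not needed. -->
<property>
  <name>dfs.datanode.max.locked.memory</name>
  <value>0</value>
</property>
```

Alternatively, running `hadoop checknative -a` on the affected host shows whether libhadoop was loaded; a JDK upgrade that changed JAVA_HOME can leave Hadoop unable to locate its native libraries.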


Re: Datanode failed to start

Master Guru
What version of CDH are you facing this on, and is it managed by Cloudera Manager?

It appears that your DataNode is trying to start without native libraries, which causes it to have trouble with the HDFS Cache feature configuration.

Could you post the full startup log and the stderr from Cloudera Manager for the failing DataNode?