Datanode failed to start

Contributor

Recently I upgraded the Oracle JDK from 1.7 to 1.8u131.

 

After the upgrade, starting the cluster fails with the error below. The value we have for dfs.datanode.max.locked.memory is 3.36 GB.

 

java.lang.RuntimeException: Cannot start datanode because the configured max locked memory size (dfs.datanode.max.locked.memory) is greater than zero and native code is not available.
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1201)
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:460)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2509)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2396)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2443)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2625)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2649)
2018-02-07 15:41:56,472 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2018-02-07 15:41:56,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
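
For reference, the effective value can be confirmed on the DataNode host with the command below (a minimal sketch; on a Cloudera Manager-managed host the DataNode's own process configuration may differ from the client configuration this command reads):

# Print the effective setting as seen by the client configuration (value in bytes; 0 disables HDFS caching).
hdfs getconf -confKey dfs.datanode.max.locked.memory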

Re: Datanode failed to start

Master Guru
What version of CDH are you facing this on, and is it managed by Cloudera Manager?

It appears that your DataNode is starting without the Hadoop native libraries (libhadoop), which conflicts with your HDFS Cache configuration: dfs.datanode.max.locked.memory may only be greater than zero when native code is available.
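
You can verify this on the affected host (a minimal sketch; the parcel path below is a typical CDH default and may differ on your install):

# Report whether the Hadoop native library (libhadoop) and related codecs are being loaded.
hadoop checknative -a

# Confirm the native library files are present (typical CDH parcel location; adjust for your layout).
ls -l /opt/cloudera/parcels/CDH/lib/hadoop/lib/native/

If libhadoop is reported as unavailable, the usual options are to restore the native library path (for example, if it was affected by the JDK change) or, as a temporary workaround, to set dfs.datanode.max.locked.memory to 0, which disables HDFS caching but allows the DataNode to start.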

Could you post the full startup log and the stderr from Cloudera Manager for the failing DataNode?