Contributor
Posts: 66
Registered: 01-19-2017

Datanode failed to start

Recently I upgraded the Oracle JDK from 1.7 to 1.8u131.

After the upgrade, when starting the cluster I get the error below. The value we have for dfs.datanode.max.locked.memory is 3.36 GB.

java.lang.RuntimeException: Cannot start datanode because the configured max locked memory size (dfs.datanode.max.locked.memory) is greater than zero and native code is not available.
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1201)
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:460)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2509)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2396)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2443)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2625)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2649)
2018-02-07 15:41:56,472 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2018-02-07 15:41:56,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
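
For reference, the effective value of the property mentioned above can be read back on the DataNode host with the standard getconf tool. A minimal sketch (the key name comes straight from the error message; the printed value is whatever the cluster has configured, in bytes):

# Print the effective locked-memory limit for this host's configuration (bytes)
hdfs getconf -confKey dfs.datanode.max.locked.memory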

Posts: 1,657
Kudos: 320
Solutions: 258
Registered: 07-31-2013

Re: Datanode failed to start

What version of CDH are you facing this on, and is it managed by Cloudera Manager?

It appears that your DataNode is starting without the Hadoop native libraries (libhadoop) loaded. The HDFS caching feature behind dfs.datanode.max.locked.memory relies on native code to lock memory, so when the configured limit is greater than zero and the native library cannot be loaded, the DataNode refuses to start, which matches the error you posted.
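
A quick way to confirm whether the native library is being picked up on the affected host is the standard checknative tool (with -a it reports every library and exits non-zero if any check fails). A sketch; the output shown is illustrative of the failing case, and your paths and results will differ:

hadoop checknative -a
# Illustrative output when libhadoop is NOT found:
# Native library checking:
# hadoop:  false
# zlib:    false
# snappy:  false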

Could you post the full startup log and the stderr from Cloudera Manager for the failing DataNode?