
Datanode failed to start

I recently upgraded the Oracle JDK from 1.7 to 1.8u131.


After the upgrade, starting the cluster fails with the error below. The value we have set is dfs.datanode.max.locked.memory: 3.36 GB.


java.lang.RuntimeException: Cannot start datanode because the configured max locked memory size (dfs.datanode.max.locked.memory) is greater than zero and native code is not available.
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(
2018-02-07 15:41:56,472 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2018-02-07 15:41:56,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:


Re: Datanode failed to start

What version of CDH are you facing this on, and is it managed by Cloudera Manager?

It appears that your DataNode is starting without the Hadoop native libraries loaded; the HDFS Cache feature (controlled by dfs.datanode.max.locked.memory) requires native code, so the DataNode refuses to start when that value is non-zero and the native libraries are unavailable.

Could you post the full startup log and the stderr from Cloudera Manager for the failing DataNode?
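In the meantime, you can verify whether the native libraries are being picked up by running `hadoop checknative -a` on the affected host. If you need the DataNodes up before the native-library issue is resolved, one possible workaround (assuming you are not actively relying on HDFS centralized caching) is to set dfs.datanode.max.locked.memory to 0, which disables memory locking for the cache and avoids this startup check. A minimal sketch of the hdfs-site.xml change — if the cluster is managed by Cloudera Manager, set the equivalent "Maximum Memory Used for Caching" DataNode property there instead of editing the file by hand:

```xml
<!-- hdfs-site.xml: disable HDFS cache memory locking so the DataNode
     can start without native code. Set this back to the desired value
     once the native libraries load correctly. -->
<property>
  <name>dfs.datanode.max.locked.memory</name>
  <value>0</value>
</property>
```

Note that this only works around the symptom; since the problem appeared right after the JDK upgrade, the root cause is likely that the new Java installation no longer finds libhadoop.so on java.library.path, which is worth fixing properly.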