
Datanode failed to start

Recently I upgraded the Oracle JDK from 1.7 to 1.8u131.

 

After the upgrade, starting the cluster fails with the error below. The value we have configured is dfs.datanode.max.locked.memory: 3.36 GB.
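For reference, the effective value can be confirmed from the command line (assuming the hdfs client is on the PATH):

hdfs getconf -confKey dfs.datanode.max.locked.memory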

 

java.lang.RuntimeException: Cannot start datanode because the configured max locked memory size (dfs.datanode.max.locked.memory) is greater than zero and native code is not available.
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1201)
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:460)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2509)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2396)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2443)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2625)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2649)
2018-02-07 15:41:56,472 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2018-02-07 15:41:56,474 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
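Since the message says native code is not available, a quick way to check whether the native Hadoop library (libhadoop) is still being picked up after the JDK change is:

hadoop checknative -a

If libhadoop reports false there, the datanode will refuse to start while dfs.datanode.max.locked.memory is greater than zero, which matches the error above.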
