Expert Contributor
Posts: 361
Registered: ‎01-25-2017

Unable to Start DataNode in kerberos cluster

Hi Guys,


I'm unable to start the DataNode after enabling Kerberos in my cluster.


I tried all of the solutions suggested in the community and on the Internet, without any success.


All other services started, and the cluster nodes are able to authenticate against Active Directory.


Here are the relevant HDFS configs:


dfs.datanode.http.address: 1006
dfs.datanode.address: 1004
kerberos authentication: true
Enable Kerberos Authentication for HTTP Web-Consoles: true


and here is the log:


STARTUP_MSG:   java = 1.8.0_101
2017-10-23 06:56:02,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2017-10-23 06:56:03,449 INFO Login successful for user hdfs/ using keytab file hdfs.keytab
2017-10-23 06:56:03,812 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
2017-10-23 06:56:03,891 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2017-10-23 06:56:03,891 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2017-10-23 06:56:03,899 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2017-10-23 06:56:03,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled.
2017-10-23 06:56:03,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is
2017-10-23 06:56:03,908 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.lang.RuntimeException: Cannot start secure DataNode without configuring either privileged resources or SASL RPC data transfer protection and SSL for HTTP.  Using privileged resources in combination with SASL RPC data transfer protection is not supported.
at org.apache.hadoop.hdfs.server.datanode.DataNode.checkSecureConfig(
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(
2017-10-23 06:56:03,919 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2017-10-23 06:56:03,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
SHUTDOWN_MSG: Shutting down DataNode at
2017-10-23 06:56:08,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   user = cloudera-scm
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 2.6.0-cdh5.13.0
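For anyone hitting the same FATAL: the exception itself names the two supported ways to run a secure DataNode, and says mixing them is not allowed. A hedged sketch of each in hdfs-site.xml (these are standard Hadoop property names; the exact ports and the `authentication` protection level are illustrative assumptions, not values taken from this cluster):

```xml
<!-- Option A: privileged resources.
     Ports below 1024; the DataNode must be launched as root via jsvc
     (HADOOP_SECURE_DN_USER set). dfs.data.transfer.protection must NOT
     be set in this mode. -->
<property>
  <name>dfs.datanode.address</name>
  <value>0.0.0.0:1004</value>
</property>
<property>
  <name>dfs.datanode.http.address</name>
  <value>0.0.0.0:1006</value>
</property>

<!-- Option B: SASL data transfer protection plus HTTPS.
     Non-privileged ports (>= 1024), no root/jsvc needed, but TLS must be
     configured for the web endpoint. -->
<property>
  <name>dfs.datanode.address</name>
  <value>0.0.0.0:10004</value>
</property>
<property>
  <name>dfs.datanode.http.address</name>
  <value>0.0.0.0:10006</value>
</property>
<property>
  <name>dfs.data.transfer.protection</name>
  <value>authentication</value>
</property>
<property>
  <name>dfs.http.policy</name>
  <value>HTTPS_ONLY</value>
</property>
```

Setting privileged ports (1004/1006) together with dfs.data.transfer.protection is exactly the unsupported combination the exception complains about, so pick one option or the other.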

Re: Unable to Start DataNode in kerberos cluster

Any insight, guys?


Re: Unable to Start DataNode in kerberos cluster

Please help!


Re: Unable to Start DataNode in kerberos cluster

Any help, guys?


I've been stuck on this for several months!


I've searched all over the web for a solution with no success.