Created 07-19-2016 06:16 AM
I enabled Kerberos authentication for HDFS. The NameNode and Secondary NameNode are running, and querying them through Kerberos works fine.
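For reference, the check that works looks roughly like this (a minimal sketch; the keytab path and principal are the ones from my configuration below):

kinit -kt /path/to/hdfs.keytab hadoop/kerberos.domain.com@DOMAIN.COM
hdfs dfs -ls /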
The issue is with the DataNode; it fails to start with this error message:
java.lang.RuntimeException: Cannot start secure DataNode without configuring either privileged resources or SASL RPC data transfer protection and SSL for HTTP. Using privileged resources in combination with SASL RPC data transfer protection is not supported.
        at org.apache.hadoop.hdfs.server.datanode.DataNode.checkSecureConfig(DataNode.java:1217)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1103)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:432)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2423)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2310)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2357)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2538)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2562)
2016-07-19 03:03:24,433 INFO util.ExitUtil (ExitUtil.java:terminate(124)) - Exiting with status 1
2016-07-19 03:03:24,434 INFO datanode.DataNode (LogAdapter.java:info(47)) - SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at datanode.domain.com/192.168.1.3
************************************************************
This is my DataNode configuration (hdfs-site.xml):
<!-- DataNode security config -->
<property>
  <name>dfs.datanode.keytab.file</name>
  <value>/path/to/hdfs.keytab</value>
</property>
<property>
  <name>dfs.datanode.kerberos.principal</name>
  <value>hadoop/kerberos.domain.com@DOMAIN.COM</value>
</property>
<property>
  <name>dfs.datanode.address</name>
  <value>0.0.0.0:1004</value>
</property>
<property>
  <name>dfs.datanode.http.address</name>
  <value>0.0.0.0:1006</value>
</property>
Following this answer, I use a user called "ambari" with sudo for deploying HDP, and the Ambari Agent runs as root. The JSVC package is installed.
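For completeness, the environment for the secure DataNode is set in hadoop-env.sh roughly like this (a sketch; the JSVC_HOME path is an assumption and varies by distribution):

# run the DataNode's privileged part as root via jsvc, then drop to this user
export HADOOP_SECURE_DN_USER=hdfs
# assumption: adjust to wherever jsvc is actually installed
export JSVC_HOME=/usr/lib/bigtop-utils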
Thanks in advance.
Created 07-19-2016 08:11 PM
Did you upgrade your cluster recently? Can you please log in to the problematic DataNode and check the following?
ps aux|grep -i datanode
If you find a PID running the DataNode process, kill it:
kill -9 <pid-of-datanode>
Try to start the DataNode with the command below and let me know how it goes:
/usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh start datanode
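One caveat: with privileged ports (1004/1006), the secure DataNode must be started as root so jsvc can bind the ports before dropping privileges to the HDFS user. A sketch of that, assuming HADOOP_SECURE_DN_USER is already exported in hadoop-env.sh:

sudo -i
/usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh start datanode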
Created 08-01-2016 11:33 PM
Hi @Facundo Bianco, you are using a privileged port number (1004) for data transfer, so you cannot also enable SASL. Please check your hdfs-site.xml to ensure SASL is not enabled via dfs.data.transfer.protection.
The Secure DataNode section of the Apache HDFS documentation describes this.
Since you are using HDP with Ambari, I recommend using the Ambari Kerberos Wizard, especially if you are setting it up for the first time. At the very least it will give you a working reference configuration. The Ambari Kerberos Wizard is documented in the Ambari documentation.
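To make the error message concrete: the DataNode will start either with privileged ports and no SASL (your current setup, provided dfs.data.transfer.protection is absent), or with SASL plus HTTPS on non-privileged ports. A sketch of the SASL variant for hdfs-site.xml (the port numbers are arbitrary non-privileged examples, and HTTPS_ONLY assumes SSL certificates are already configured):

<property>
  <name>dfs.data.transfer.protection</name>
  <value>authentication</value>
</property>
<property>
  <name>dfs.http.policy</name>
  <value>HTTPS_ONLY</value>
</property>
<property>
  <!-- non-privileged port, so jsvc is no longer required -->
  <name>dfs.datanode.address</name>
  <value>0.0.0.0:10004</value>
</property>
<property>
  <name>dfs.datanode.https.address</name>
  <value>0.0.0.0:10006</value>
</property>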
Created 07-07-2017 08:06 AM
Not working; I followed the same steps and the issue remains the same.