
DataNode fails to start after enabling Kerberos

New Contributor

CDH version: 2.6.0-cdh5.11.2

Kerberos version: 1.10.3

System: CentOS 6.8

Following the official documentation, I set the variables for secure DataNodes as follows:

 

export HADOOP_SECURE_DN_USER=hdfs
export HADOOP_SECURE_DN_PID_DIR=/var/lib/hadoop-hdfs
export HADOOP_SECURE_DN_LOG_DIR=/var/log/hadoop-hdfs
export JSVC_HOME=/usr/lib/bigtop-utils/
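If it helps to see what each variable does, here is an annotated version of those settings as they would appear in hadoop-env.sh (the comments reflect the usual jsvc-based secure DataNode startup on CDH package installs; adjust paths for your layout):

```shell
# Annotated secure-DataNode settings for hadoop-env.sh.
# jsvc starts the DataNode as root so it can bind privileged
# ports (<1024), then drops privileges to this user:
export HADOOP_SECURE_DN_USER=hdfs
# PID and log directories must be writable by that user:
export HADOOP_SECURE_DN_PID_DIR=/var/lib/hadoop-hdfs
export HADOOP_SECURE_DN_LOG_DIR=/var/log/hadoop-hdfs
# Location of the jsvc binary (shipped in bigtop-utils on CDH):
export JSVC_HOME=/usr/lib/bigtop-utils/
```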

However, the DataNode service failed to restart.

[screenshot 11.png: DataNode startup failure]

Finally, I found that the startup script and the variable 'export HADOOP_SECURE_DN_USER=hdfs' conflict.

[screenshot 2222.png: conflict in the startup script]

I have no idea how to resolve this conflict; could anyone help? Thanks.

1 REPLY

Expert Contributor

Hi,

Are you using an environment managed by Cloudera Manager? Setting parameters such as the secure DataNode user should not be required when you follow the Cloudera Manager-guided Kerberos path. Please refer to https://www.cloudera.com/documentation/enterprise/5-11-x/topics/cm_sg_intro_kerb.html

 

Please note that running DataNodes as root was required so HDFS could bind to privileged ports (<1024), which protects against attackers spinning up rogue DataNodes inside YARN jobs. With SASL data-transfer protection and SSL this is no longer necessary, and the DataNode process runs as "hdfs" again.
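Roughly, that SASL-based setup looks like this in hdfs-site.xml (the property names come from the upstream HDFS secure-mode documentation; the non-privileged port values here are illustrative, not prescribed):

```xml
<!-- Enable SASL protection on the data-transfer protocol
     instead of relying on privileged ports. -->
<property>
  <name>dfs.data.transfer.protection</name>
  <value>privacy</value>
</property>
<!-- SASL on non-privileged ports requires HTTPS for the web UI. -->
<property>
  <name>dfs.http.policy</name>
  <value>HTTPS_ONLY</value>
</property>
<!-- Move the DataNode to non-privileged ports (example values). -->
<property>
  <name>dfs.datanode.address</name>
  <value>0.0.0.0:10004</value>
</property>
<property>
  <name>dfs.datanode.http.address</name>
  <value>0.0.0.0:10006</value>
</property>
```

With this configuration, HADOOP_SECURE_DN_USER should be left unset so the DataNode starts directly as "hdfs" without jsvc.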

 

Regards

Benjamin