
namenode failed to start after enabling kerberos


Hi Team,

I am currently using HDP 3.0 and Ambari 2.7.3, and I have enabled Kerberos from Ambari.

The KDC is installed and configured. I am able to kinit and create principals.

I tried the following:
kinit -kt /etc/security/keytabs/nn.service.keytab nn/
[root@mastern1 ~]# klist -e
Ticket cache: FILE:/tmp/krb5cc_0
Default principal: nn/

Valid starting Expires Service principal
06/04/2020 20:19:07 06/05/2020 20:19:07 krbtgt/BMS.COM@BMS.COM
Etype (skey, tkt): aes256-cts-hmac-sha1-96, aes256-cts-hmac-sha1-96


[root@mastern1 ~]# cat /etc/krb5.conf

[libdefaults]
  renew_lifetime = 7d
  forwardable = true
  default_realm = BMS.COM
  ticket_lifetime = 24h
  dns_lookup_realm = false
  dns_lookup_kdc = false
  default_ccache_name = /tmp/krb5cc_%{uid}
  #default_tgs_enctypes = aes des3-cbc-sha1 rc4 des-cbc-md5
  #default_tkt_enctypes = aes des3-cbc-sha1 rc4 des-cbc-md5
  udp_preference_limit = 1

[domain_realm]
   = BMS.COM

[logging]
  default = FILE:/var/log/krb5kdc.log
  admin_server = FILE:/var/log/kadmind.log
  kdc = FILE:/var/log/krb5kdc.log

[realms]
  BMS.COM = {
    admin_server =
    kdc =
  }
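For reference, the section layout of the file can be sanity-checked by listing its headers; a well-formed Ambari-generated krb5.conf normally has [libdefaults], [domain_realm], [logging], and [realms]. Shown here on a scratch copy, since the hostnames in the post are redacted; on a real host, point KRB5_CONF at /etc/krb5.conf.

```shell
# List krb5.conf section headers with their line numbers.
# Scratch copy stands in for /etc/krb5.conf.
KRB5_CONF=./krb5.conf.check
cat > "$KRB5_CONF" <<'EOF'
[libdefaults]
  default_realm = BMS.COM
[domain_realm]
[logging]
[realms]
EOF

grep -n '^\[' "$KRB5_CONF"   # prints the four section headers
```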



The error I see in the NameNode log is below:

STARTUP_MSG: java = 1.8.0_252
2020-06-04 20:13:01,750 INFO namenode.NameNode ( - registered UNIX signal handlers for [TERM, HUP, INT]
2020-06-04 20:13:02,322 INFO namenode.NameNode ( - createNameNode []
2020-06-04 20:13:03,145 INFO impl.MetricsConfig ( - Loaded properties from
2020-06-04 20:13:05,390 INFO timeline.HadoopTimelineMetricsSink ( - Initializing Timeline metrics sink.
2020-06-04 20:13:05,390 INFO timeline.HadoopTimelineMetricsSink ( - Identified hostname =, serviceName = namenode
2020-06-04 20:13:06,516 WARN availability.MetricCollectorHAHelper ( - Unable to connect to zookeeper.
org.apache.ambari.metrics.sink.relocated.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /ambari-metrics-cluster
at org.apache.ambari.metrics.sink.relocated.zookeeper.KeeperException.create(
at org.apache.ambari.metrics.sink.relocated.zookeeper.KeeperException.create(
at org.apache.ambari.metrics.sink.relocated.zookeeper.ZooKeeper.exists(
at org.apache.ambari.metrics.sink.relocated.zookeeper.ZooKeeper.exists(
at org.apache.hadoop.metrics2.sink.timeline.availability.MetricCollectorHAHelper.findLiveCollectorHostsFromZNode(
at org.apache.hadoop.metrics2.sink.timeline.AbstractTimelineMetricsSink.findPreferredCollectHost(
at org.apache.hadoop.metrics2.sink.timeline.HadoopTimelineMetricsSink.init(
at org.apache.hadoop.metrics2.impl.MetricsConfig.getPlugin(
at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.newSink(
at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.configureSinks(
at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.configure(
at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.start(
at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.init(
at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.init(
at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.initialize(
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(
2020-06-04 20:13:07,547 INFO timeline.HadoopTimelineMetricsSink ( - No suitable collector found.
2020-06-04 20:13:07,551 INFO timeline.HadoopTimelineMetricsSink ( - RPC port properties configured: {8020=client}
2020-06-04 20:13:07,618 INFO impl.MetricsSinkAdapter ( - Sink timeline started
2020-06-04 20:13:08,229 INFO impl.MetricsSystemImpl ( - Scheduled Metric snapshot period at 10 second(s).
2020-06-04 20:13:08,229 INFO impl.MetricsSystemImpl ( - NameNode metrics system started
2020-06-04 20:13:08,480 INFO namenode.NameNodeUtils ( - fs.defaultFS is hdfs://
2020-06-04 20:13:08,480 INFO namenode.NameNode (<init>(928)) - Clients should use to access this namenode/service.
2020-06-04 20:13:10,005 ERROR namenode.NameNode ( - Failed to start namenode. failure to login: for principal: nn/ from keytab /etc/security/keytabs/nn.service.keytab Message stream modified (41)
at org.apache.hadoop.hdfs.server.namenode.NameNode.loginAsNameNodeUser(
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(
Caused by: Message stream modified (41)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(
at sun.reflect.DelegatingMethodAccessorImpl.invoke(
at java.lang.reflect.Method.invoke(
at Method)
... 9 more
Caused by: KrbException: Message stream modified (41)
... 23 more
2020-06-04 20:13:10,011 INFO util.ExitUtil ( - Exiting with status 1: failure to login: for principal: nn/ from keytab /etc/security/keytabs/nn.service.keytab Message stream modified (41)
2020-06-04 20:13:10,132 INFO namenode.NameNode ( - SHUTDOWN_MSG:



When I start the NameNode service from Ambari, below is the message that I see:

2020-06-04 20:13:01,851 - Waiting for this NameNode to leave Safemode due to the following conditions: HA: False, isActive: True, upgradeType: None
2020-06-04 20:13:01,852 - Waiting up to 19 minutes for the NameNode to leave Safemode...
2020-06-04 20:13:01,852 - Execute['/usr/hdp/current/hadoop-hdfs-namenode/bin/hdfs dfsadmin -fs hdfs:// -safemode get | grep 'Safe mode is OFF''] {'logoutput': True, 'tries': 115, 'user': 'hdfs', 'try_sleep': 10}
safemode: Call From to failed on connection exception: Connection refused; For more details see:
2020-06-04 20:13:14,639 - Retrying after 10 seconds. Reason: Execution of '/usr/hdp/current/hadoop-hdfs-namenode/bin/hdfs dfsadmin -fs hdfs:// -safemode get | grep 'Safe mode is OFF'' returned 1. safemode: Call From to failed on connection exception: Connection refused; For more details see:
safemode: Call From to failed on connection exception: Connection refused; For more details see:

Please help me fix this issue.





Super Collaborator

@shrikant_bm Can you confirm the Java version?



Java version: 1.8.0_252

Java path in .bashrc:

export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk/


I am not getting any responses on this post. Any reason?

Is there anything missing in my post?

Please help by suggesting a solution.



Super Collaborator

@shrikant_bm Can you try changing "" to "" in the file under JDK HOME on the NameNode host?
Example: /usr/java/jdk1.8.0_252/jre/lib/security/ file.
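The exact property name is missing from the reply above; the assumption in the sketch below is that it is sun.security.krb5.disableReferrals, the java.security setting commonly flipped to true when OpenJDK 8u242+ reports "KrbException: Message stream modified (41)". Demonstrated on a scratch file; on a real host the target would be the java.security file under the JDK's jre/lib/security directory on the NameNode host.

```shell
# Flip a JDK security property from false to true, keeping a backup.
# ASSUMPTION: the property is sun.security.krb5.disableReferrals
# (the name was lost from the reply this illustrates).
# Scratch file stands in for $JAVA_HOME/jre/lib/security/java.security.
SEC_FILE=./java.security.sample
cat > "$SEC_FILE" <<'EOF'
sun.security.krb5.disableReferrals=false
EOF

cp "$SEC_FILE" "$SEC_FILE.bak"   # keep a backup before editing
sed -i 's/^\(sun\.security\.krb5\.disableReferrals\)=false/\1=true/' "$SEC_FILE"
cat "$SEC_FILE"                  # prints sun.security.krb5.disableReferrals=true
```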




@Scharan, after making the change to true, I am still facing the same issue.


Can you please point me to a good article or document on installing and configuring Kerberos, and on enabling Kerberos from Ambari?


New Contributor

@shrikant_bm A similar issue for me got resolved after removing the 'renew_lifetime' line from /etc/krb5.conf.

The following link also provides additional information regarding this issue:
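The fix described above (dropping renew_lifetime, the workaround that resolved this thread) can be scripted with a backup. Sketched here on a scratch copy, since the real edit touches /etc/krb5.conf; on a cluster, the change is needed on every host, followed by a restart of the affected services.

```shell
# Remove the renew_lifetime line from krb5.conf, keeping a backup.
# Scratch copy stands in for /etc/krb5.conf.
KRB5_CONF=./krb5.conf.sample
cat > "$KRB5_CONF" <<'EOF'
[libdefaults]
  renew_lifetime = 7d
  forwardable = true
  default_realm = BMS.COM
  ticket_lifetime = 24h
EOF

cp "$KRB5_CONF" "$KRB5_CONF.bak"        # keep a backup before editing
sed -i '/renew_lifetime/d' "$KRB5_CONF"
grep -c 'renew_lifetime' "$KRB5_CONF" || true   # prints 0: line removed
```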