
DataNode fails after enabling Kerberos

New Contributor

Hello Everybody,

I wanted to enable Kerberos on a running server (CentOS 7.5) with Ambari.
I decided to do this with the Kerberos Wizard inside Ambari, as described here: Launching the Kerberos Wizard
For that I installed and set up a new MIT KDC as described in the documentation: Install a new MIT KDC

In the final step of the wizard, the DataNode fails to start with the following error output.

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HDFS/package/scripts/", line 126, in <module>
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/", line 354, in execute
    self.execute_prefix_function(self.command_name, 'post', env)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/", line 378, in execute_prefix_function
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/", line 420, in post_start
    raise Fail("Pid file {0} doesn't exist after starting of the component.".format(pid_file))
resource_management.core.exceptions.Fail: Pid file /var/run/hadoop/hdfs/ doesn't exist after starting of the component.

When I looked in the DataNode's log file, it was empty for the timestamp of the start attempt.

If I disable Kerberos again, all services start as expected.

Does anyone have an idea or hint how I can enable Kerberos and get the DataNode running afterwards?
I'm grateful for any help.


Super Mentor

@Philipp Schoder

After kerberizing the cluster, are you able to start the DataNode manually?
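Starting the daemon by hand often surfaces the real error on the console instead of leaving an empty log. A rough sketch, assuming HDP 3.x / Hadoop 3.x where the daemon helper is built into the `hdfs` command (the guard lets the snippet no-op on a machine without the hdfs service user; adjust paths to your install):

```shell
# Try starting the DataNode directly as the hdfs service user so any
# startup failure prints to the console rather than a (possibly empty) log.
if id hdfs >/dev/null 2>&1; then
  su - hdfs -c "hdfs --daemon start datanode"
else
  echo "hdfs user not present here; run this on the DataNode host"
fi
```

Note that on a kerberized cluster binding privileged ports, the DataNode runs in Hadoop's secure mode and is normally launched as root (via jsvc, with the secure user set in hadoop-env.sh), so the plain form above may not apply to your setup.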

Ideally, in a kerberized environment you should see a PID file with the name "/var/run/hadoop/hdfs/" instead of "/var/run/hadoop/hdfs/".



Can you check whether you can create an empty file with the following permissions as the hdfs user?

# ls -l /var/run/hadoop/hdfs/
-rw-r--r--. 1 hdfs hadoop 6 Oct 29 06:26 /var/run/hadoop/hdfs/
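The check above boils down to: can the hdfs user create and write a small file in the PID directory? A generic sketch of that test using a scratch directory (on the real host, point it at /var/run/hadoop/hdfs and run the write as the hdfs user, e.g. sudo -u hdfs touch /var/run/hadoop/hdfs/test.pid):

```shell
# Simulate the PID-file write in a throwaway directory; a PID file only
# holds the process id, so writing a short number is a faithful test.
RUN_DIR=$(mktemp -d)
echo 12345 > "$RUN_DIR/test.pid"
ls -l "$RUN_DIR/test.pid"
rm -rf "$RUN_DIR"
```

If the equivalent write as hdfs into /var/run/hadoop/hdfs fails, check the directory's ownership and mode (it should be writable by the hdfs user).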


New Contributor

@Jay Kumar SenSharma

No, if I try to start the DataNode from the Ambari web interface, it fails with the same result.

Yes, that works; I can create the empty file as the hdfs user.
