
Problems with kerberos (NFS and File view)



We have Kerberos security enabled in Ambari. All services are running without any problems.

Now we are trying to connect through NFS with Kerberos security.

mount -t nfs -o vers=3,proto=tcp,nolock,sync,noatime server-datalake:/ /mnt/hdfs

But with Kerberos (sec=krb5) we get this error:

mount -t nfs -o sec=krb5,vers=3,noatime server-datalake:/ /mnt/hdfs -vvvv 
mount.nfs: timeout set for Mon Aug 22 13:51:37 2016 
mount.nfs: trying text-based options 'sec=krb5,vers=3,proto=tcp,nolock,addr=' 
mount.nfs: prog 100003, trying vers=3, prot=6 
mount.nfs: trying prog 100003 vers 3 prot TCP port 2049 
mount.nfs: prog 100005, trying vers=3, prot=6 
mount.nfs: trying prog 100005 vers 3 prot TCP port 4242
mount.nfs: mount(2): Permission denied
mount.nfs: access denied by server while mounting server-datalake:/

What could be the problem?

Separately, when configuring the File View with Kerberos, we get this error:

500 Authentication required

Any help?

Thanks in advance


@Blanca Sanz

Have you performed a kinit before attempting to mount the NFS drive? You will need a valid Kerberos ticket to use the NFS Gateway.

$ kinit <username>
Enter password for <username>:
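After kinit succeeds, it is worth verifying the credentials before retrying the mount. As a sketch (exact service names vary by distribution): klist shows the user's ticket, and for a sec=krb5 NFS mount the client machine itself typically also needs rpc.gssd running plus a host key in /etc/krb5.keytab.

```shell
# Verify the user has a valid TGT (look for a krbtgt/REALM entry)
klist

# For sec=krb5 mounts the client machine needs its own credentials too:
# check that the GSS daemon is running (name varies by distro) ...
service rpcgssd status      # or: systemctl status rpc-gssd

# ... and that the machine keytab contains a host or nfs principal
klist -k /etc/krb5.keytab
```

A "Permission denied / access denied by server" from mount.nfs is often the server refusing the client's machine credentials rather than the user's ticket.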

That could also cause the problem you are seeing with the File View. You need SPNEGO enabled, and your browser must be set up to pass Kerberos credentials for authentication (the setup varies by browser).
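As a quick way to test SPNEGO independently of browser settings, you can use curl's Negotiate support after a kinit. The host, port, and path below are placeholders for your Ambari server:

```shell
# Ask curl to authenticate via SPNEGO/Negotiate using the current
# Kerberos ticket; replace ambari-server:8080 with your real host/port.
curl --negotiate -u : -I http://ambari-server:8080/api/v1/clusters
```

If curl gets a 200 but the browser still shows "500 Authentication required", the remaining problem is browser configuration (for example, in Firefox the Ambari host must be listed in network.negotiate-auth.trusted-uris).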


Thanks @emaxwell, that gave me a clue.

But we still have problems.

Now we are trying to connect through a Hadoop client, and we get the following error:

bash-4.1# kinit user
Password for USER@MY.DOMAIN:
bash-4.1# /usr/local/hadoop/bin/hadoop fs -ls hdfs://server-datalake/data -vv
ls: SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
ls: Call From c43852c98eb6/ to server-datalake:9000 failed on connection exception: Connection refused; For more details see:

And about file view, we still have problems.


Maybe the problem is in the client configuration, because if we run hadoop fs -ls on the server itself we don't get any errors.

It's been a nightmare! We just want a secure way (through Kerberos) to upload files via the Ambari server!

@Blanca Sanz

Have you updated your core-site.xml and hdfs-site.xml on the client? Your client is trying to use simple authentication instead of Kerberos.
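As a minimal sketch, the client's core-site.xml needs at least the properties below. The values are illustrative: hadoop.security.authentication is what switches the client from SIMPLE to Kerberos, and fs.defaultFS must match the NameNode's actual RPC address (check the server's own core-site.xml; a "Connection refused" on port 9000 suggests the client may be pointing at the wrong port).

```xml
<!-- client-side core-site.xml: switch from SIMPLE to Kerberos auth -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>
<property>
  <!-- must match the NameNode RPC address from the server's config;
       8020 here is only an example -->
  <name>fs.defaultFS</name>
  <value>hdfs://server-datalake:8020</value>
</property>
```

The client's hdfs-site.xml also needs the NameNode's Kerberos principal (dfs.namenode.kerberos.principal); the simplest approach is usually to copy both files from a working cluster node.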