Created 08-22-2016 11:56 AM
Hi,
We have Kerberos security enabled in Ambari, and all services are running without any problems.
Now we are trying to connect over NFS with Kerberos security. Without Kerberos, this mount works fine:
mount -t nfs -o vers=3,proto=tcp,nolock,sync,noatime server-datalake:/ /mnt/hdfs
But with Kerberos we get this error:
mount -t nfs -o sec=krb5,vers=3,noatime server-datalake:/ /mnt/hdfs -vvvv
mount.nfs: timeout set for Mon Aug 22 13:51:37 2016
mount.nfs: trying text-based options 'sec=krb5,vers=3,proto=tcp,nolock,addr=172.16.7.1'
mount.nfs: prog 100003, trying vers=3, prot=6
mount.nfs: trying 172.16.7.1 prog 100003 vers 3 prot TCP port 2049
mount.nfs: prog 100005, trying vers=3, prot=6
mount.nfs: trying 172.16.7.1 prog 100005 vers 3 prot TCP port 4242
mount.nfs: mount(2): Permission denied
mount.nfs: access denied by server while mounting server-datalake:/
What could be the problem?
Separately, when configuring the Files view with Kerberos, we get this error:
500 Authentication required
Any help?
Thanks in advance
Created 08-22-2016 11:17 PM
Have you performed a kinit before attempting to mount the NFS drive? You will need a valid Kerberos ticket to use the NFS Gateway.
$ kinit <username>
Enter password for <username>:
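For example, you can confirm the ticket is in the credential cache with klist before retrying the mount (a minimal sketch; the principal is a placeholder and the mount command is the one from your post):

kinit user@MY.DOMAIN   # principal is a placeholder
klist                  # should list a valid krbtgt ticket
mount -t nfs -o sec=krb5,vers=3,noatime server-datalake:/ /mnt/hdfs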
That could also cause the problem you are seeing with the file view. You need SPNEGO enabled, and your browser must be set up to pass Kerberos credentials for authentication (the setup varies by browser).
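To rule out the browser setup, you can test SPNEGO from the command line with an existing Kerberos ticket; a sketch, where the host and port are placeholders for your NameNode's WebHDFS endpoint:

# uses the current ticket cache to authenticate via SPNEGO
curl --negotiate -u : "http://server-datalake:50070/webhdfs/v1/?op=LISTSTATUS"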
Created 08-23-2016 11:07 AM
Thanks @emaxwell, that gives me a clue.
But we still have problems.
Now we are trying to connect through a Hadoop client, and we get the following error:
bash-4.1# kinit user
Password for USER@MY.DOMAIN:
bash-4.1# /usr/local/hadoop/bin/hadoop fs -ls hdfs://server-datalake/data -vv
ls: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
ls: Call From c43852c98eb6/172.17.0.2 to server-datalake:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
And we still have problems with the Files view.
Created 08-23-2016 12:16 PM
Maybe the problem is in the client configuration, because if we run hadoop fs -ls on the server itself we don't get any errors.
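One way to check this (a sketch; the config keys are standard Hadoop names and the path matches our client install) is to compare the effective configuration on the client and on the server; the values should match:

# run on both client and server and compare the output
/usr/local/hadoop/bin/hdfs getconf -confKey fs.defaultFS
/usr/local/hadoop/bin/hdfs getconf -confKey hadoop.security.authentication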
It's been a nightmare! We just want a secure way (through Kerberos) to put files onto the server!
Created 08-23-2016 01:41 PM
Have you updated your core-site.xml and hdfs-site.xml on the client? Your client is trying to use simple authentication instead of Kerberos.
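At a minimum, the client's core-site.xml must switch authentication to Kerberos and point at the right NameNode, and hdfs-site.xml needs the NameNode's service principal. A sketch follows: the property names are standard Hadoop, but the realm, host, and port are placeholders you must match to your cluster (the "Connection refused" on port 9000 in your error suggests the client's fs.defaultFS does not match the NameNode's actual RPC port).

<!-- core-site.xml on the client -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value> <!-- was "simple" -->
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://server-datalake:8020</value> <!-- match the NameNode RPC port -->
</property>

<!-- hdfs-site.xml on the client: NameNode service principal (realm is a placeholder) -->
<property>
  <name>dfs.namenode.kerberos.principal</name>
  <value>nn/_HOST@MY.DOMAIN</value>
</property>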