Ambari Views fail after Kerberos is enabled

Contributor

Hi,

I am not able to open any Ambari Views except the YARN view after enabling Kerberos. I don't have any proxy users set up, and I only have the Ambari server.

Any suggestions, please?

How do I configure the views after Kerberos is enabled?

Hive View:

Issues detected
Service 'ats' check failed: Server Error
Service 'userhome' check failed: Authentication required
Service 'userhome' check failed:
org.apache.hadoop.security.AccessControlException: Authentication required
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:457)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:113)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:738)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:582)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:612)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:608)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getHdfsFileStatus(WebHdfsFileSystem.java:987)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getFileStatus(WebHdfsFileSystem.java:1003)
	at org.apache.ambari.view.utils.hdfs.HdfsApi$3.run(HdfsApi.java:127)
	at org.apache.ambari.view.utils.hdfs.HdfsApi$3.run(HdfsApi.java:125)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)

Trace: Ambari Files View

Authentication required
org.apache.hadoop.security.AccessControlException: Authentication required
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:457)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:113)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:738)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:582)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:612)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
1 ACCEPTED SOLUTION

Master Mentor

@Sam Red

Then you have to set the following proxyuser properties in core-site.xml:

hadoop.proxyuser.root.groups=*
hadoop.proxyuser.root.hosts=* 
hadoop.proxyuser.ambari-server.groups=*
hadoop.proxyuser.ambari-server.hosts=*
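
For reference, these proxyuser properties live in HDFS's core-site.xml (in Ambari, under HDFS > Configs, Custom core-site). A minimal sketch of the resulting core-site.xml entries, assuming the Ambari server runs as root and authenticates as the "ambari-server" principal, would look like:

<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.ambari-server.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.ambari-server.groups</name>
  <value>*</value>
</property>

HDFS (and the other services that read core-site.xml) must be restarted for the new proxyuser settings to take effect.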


21 REPLIES

Master Mentor

@Sam Red

The "Service 'userhome' check failed" is easy to resolve you need to create the home directory in

# su - hdfs 
$ hdfs dfs -mkdir /user/hive 
$ hdfs dfs -chown hive:hdfs /user/hive
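
Note that with Kerberos enabled, the hdfs user needs a valid ticket before the commands above will succeed. A minimal sketch, assuming the typical HDP headless keytab location and a placeholder cluster name and realm (adjust to your environment):

$ kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-<clustername>@EXAMPLE.COM
$ klist

Once klist shows a valid ticket, run the hdfs dfs commands above.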

Then retry the Hive View.

That should work, let me know.

New Contributor

If Kerberos is enabled, try the steps below (a sketch for creating the principal and keytab follows the list):

1) ambari-server setup-security

2) Enter choice, (1-5): 3

3) Enter ambari server's kerberos principal name (ambari@EXAMPLE.COM): ambari-serve-xxx@REALM.COM

4) Enter keytab path for ambari server's kerberos principal: /etc/security/keytabs/ambari.server.keytab

5) Restart the Ambari server: ambari-server restart
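
The principal and keytab referenced in steps 3 and 4 must already exist before running setup-security. A minimal sketch for creating them on an MIT KDC, using placeholder principal, realm, and keytab names (substitute your own values):

# on the KDC host, create the Ambari server principal and export its keytab
kadmin.local -q "addprinc -randkey ambari-server@REALM.COM"
kadmin.local -q "ktadd -k /etc/security/keytabs/ambari.server.keytab ambari-server@REALM.COM"
# copy the keytab to the Ambari server host and restrict access to the account running ambari-server (root by default)
chmod 600 /etc/security/keytabs/ambari.server.keytab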


Do upvote if this helped.

Ref: https://docs.hortonworks.com/HDPDocuments/HDP3/HDP-3.1.0/authentication-with-kerberos/content/set_up...