Created 08-31-2017 05:45 PM
Hi,
I am not able to access any Ambari Views except the YARN view after enabling Kerberos. I don't have any proxy users set up, just the Ambari server.
Any suggestions on how to configure the views after Kerberos is enabled, please?
Hive View:
Issues detected:
Service 'ats' check failed: Server Error
Service 'userhome' check failed: Authentication required
Service 'userhome' check failed:
org.apache.hadoop.security.AccessControlException: Authentication required
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:457)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:113)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:738)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:582)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:612)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:608)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getHdfsFileStatus(WebHdfsFileSystem.java:987)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getFileStatus(WebHdfsFileSystem.java:1003)
    at org.apache.ambari.view.utils.hdfs.HdfsApi$3.run(HdfsApi.java:127)
    at org.apache.ambari.view.utils.hdfs.HdfsApi$3.run(HdfsApi.java:125)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
Trace (Ambari Files View):
Authentication required
org.apache.hadoop.security.AccessControlException: Authentication required
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:457)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:113)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:738)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:582)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:612)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
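For what it's worth, the same failure should reproduce outside the views with a plain WebHDFS call (hostname and user path are placeholders; 50070 is the default NameNode HTTP port):

# an unauthenticated request against a secure cluster fails with 401 "Authentication required":
curl -i "http://namenode.example.com:50070/webhdfs/v1/user/admin?op=GETFILESTATUS"
# the same call with a valid Kerberos ticket and SPNEGO succeeds:
kinit admin@EXAMPLE.COM
curl -i --negotiate -u : "http://namenode.example.com:50070/webhdfs/v1/user/admin?op=GETFILESTATUS"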
Created 08-31-2017 09:54 PM
Then you have to allow the Ambari server to impersonate the view users. Once Kerberos is enabled, the views authenticate to HDFS as the ambari-server (or root) principal and perform a doAs on behalf of the logged-in user, so HDFS must trust that principal as a proxy user. Set the following in core-site.xml and restart HDFS:
hadoop.proxyuser.root.groups=*
hadoop.proxyuser.root.hosts=*
hadoop.proxyuser.ambari-server.groups=*
hadoop.proxyuser.ambari-server.hosts=*
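In Ambari these can be added under HDFS > Configs > Custom core-site. In a manually managed core-site.xml the same settings look like this (ambari-server entries shown; repeat for root if your server runs as root):

<property>
  <name>hadoop.proxyuser.ambari-server.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.ambari-server.groups</name>
  <value>*</value>
</property>

Note that * is fully permissive; on a production cluster you may want to restrict the hosts value to the Ambari server host.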
Created 09-01-2017 09:05 PM
The "Service 'userhome' check failed" is easy to resolve you need to create the home directory in
# su - hdfs
$ hdfs dfs -mkdir /user/hive
$ hdfs dfs -chown hive:hdfs /user/hive
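One caveat on a Kerberized cluster: the hdfs user needs a valid Kerberos ticket before these commands will succeed. A minimal sketch, assuming the HDP default headless keytab path and naming convention (cluster name and realm are placeholders):

# su - hdfs
$ klist -kt /etc/security/keytabs/hdfs.headless.keytab
$ kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-mycluster@EXAMPLE.COM

klist -kt lists the principals stored in the keytab; kinit -kt obtains a ticket non-interactively using that keytab.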
Then retry the Hive View.
That should work, let me know.
Created 05-16-2019 06:46 PM
If Kerberos is enabled, try the steps below (a quick keytab sanity check follows the list):
1) Run: ambari-server setup-security
2) At the prompt "Enter choice, (1-5):", enter 3 (Setup Ambari kerberos JAAS configuration).
3) At "Enter ambari server's kerberos principal name (ambari@EXAMPLE.COM):", enter your principal, e.g. ambari-server-xxx@REALM.COM
4) At "Enter keytab path for ambari server's kerberos principal:", enter /etc/security/keytabs/ambari.server.keytab
5) Restart the Ambari server: ambari-server restart
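Before step 1, it is worth confirming that the keytab from step 4 actually contains the principal you plan to enter; a quick check (the principal below is a placeholder, as in step 3):

# klist -kt /etc/security/keytabs/ambari.server.keytab
# kinit -kt /etc/security/keytabs/ambari.server.keytab ambari-server-xxx@REALM.COM
# kdestroy

If kinit fails here, setup-security will not help until the keytab or principal is fixed.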
Do upvote if this article helped.