Created 08-31-2017 05:45 PM
Hi,
I am not able to view any Ambari views except YARN after enabling Kerberos. I don't have any proxy users set up, and I only have the Ambari server.
Any suggestions, please: how do I configure the views after Kerberos is enabled?
Hive View:
Issues detected:
Service 'ats' check failed: Server Error
Service 'userhome' check failed: Authentication required
Service 'userhome' check failed:
org.apache.hadoop.security.AccessControlException: Authentication required
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:457)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:113)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:738)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:582)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:612)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:608)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getHdfsFileStatus(WebHdfsFileSystem.java:987)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getFileStatus(WebHdfsFileSystem.java:1003)
    at org.apache.ambari.view.utils.hdfs.HdfsApi$3.run(HdfsApi.java:127)
    at org.apache.ambari.view.utils.hdfs.HdfsApi$3.run(HdfsApi.java:125)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
Trace: Ambari Files View
Authentication required
org.apache.hadoop.security.AccessControlException: Authentication required
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:457)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:113)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:738)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:582)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:612)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
Created 08-31-2017 09:54 PM
Then you have to use:
hadoop.proxyuser.root.groups=*
hadoop.proxyuser.root.hosts=*
hadoop.proxyuser.ambari-server.groups=*
hadoop.proxyuser.ambari-server.hosts=*
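In core-site.xml form (set via Ambari under HDFS > Configs > Custom core-site, then restart HDFS), the root case would look roughly like this sketch:

```xml
<!-- Allow the user running Ambari Server (here: root) to impersonate
     any user from any host via WebHDFS. -->
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
```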
Created 08-31-2017 07:36 PM
1. You will need to create a user home directory for the user running the view (in this case admin), as root:
# su - hdfs
$ hdfs dfs -mkdir /user/admin
$ hdfs dfs -chown admin:hdfs /user/admin
That should resolve the "Service 'userhome' check failed: Authentication required" error.
2. For Kerberos authentication for both the Hive and Files views, create a new view and set:
WebHDFS Authentication = auth=KERBEROS;proxyuser=ambari-server-$cluster@REALM
To configure the Hive/Files views, create a new view; see the example below:
Instance name = Test2
Display name = Test2
Description = Test Files or Hive
Keep all the other parameters as is and only change the WebHDFS Authentication (see the attached screenshot). The value for proxyuser should be the value obtained in screenshot 1 using the ambari-server filter; the values for Hive and HDFS are identical.
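As a sketch of how that WebHDFS Authentication value is assembled (the cluster name and realm below are placeholders, not values from this thread):

```python
def webhdfs_auth(cluster: str, realm: str) -> str:
    """Build the WebHDFS Authentication value for a Kerberized view.

    Assumes the Ambari server principal follows the
    ambari-server-<cluster>@<REALM> naming pattern shown above.
    """
    return f"auth=KERBEROS;proxyuser=ambari-server-{cluster}@{realm}"

# Example with placeholder values:
print(webhdfs_auth("c1", "EXAMPLE.COM"))
# auth=KERBEROS;proxyuser=ambari-server-c1@EXAMPLE.COM
```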
Please let me know if that helped.
Created on 08-31-2017 08:39 PM - edited 08-17-2019 05:35 PM
Created on 08-31-2017 08:42 PM - edited 08-17-2019 05:35 PM
Files View:
Local cluster:
org.apache.hadoop.security.AccessControlException: Authentication required
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:457)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:113)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:738)
    at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:582)
    at org.apache.hadoop.hdfs.web.W...
Created 08-31-2017 08:54 PM
What are the values of the ambari-server filter for Hive and HDFS?
HIVE
webhcat.proxyuser.ambari-server-xxxx.groups=*
webhcat.proxyuser.ambari-server-xxxx.hosts=*
HDFS
hadoop.proxyuser.hcat.groups=*
hadoop.proxyuser.hcat.hosts=*
hadoop.proxyuser.hdfs.groups=*
hadoop.proxyuser.hdfs.hosts=*
Created 08-31-2017 09:22 PM
Everything is *.
Created 08-31-2017 09:14 PM
What are the versions of your Hive and Files views?
Created 08-31-2017 09:18 PM
Created 08-31-2017 09:29 PM
Can you also check these values?
If you are running Ambari Server as the root user, then add the properties below:
hadoop.proxyuser.root.groups=*
hadoop.proxyuser.root.hosts=*
If you are running Ambari Server as a non-root user, then add the properties below in core-site.xml:
hadoop.proxyuser.<ambari-server-user>.groups=*
hadoop.proxyuser.<ambari-server-user>.hosts=*
Replace <ambari-server-user> with the user running Ambari Server in the example above.
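A small sketch of the pair of property names this produces for a given Ambari Server user (the user names below are examples, not values from this cluster):

```python
def proxyuser_props(user: str) -> dict:
    """core-site.xml proxyuser properties letting `user` impersonate
    any user from any host (the wide-open '*' settings discussed above)."""
    return {
        f"hadoop.proxyuser.{user}.groups": "*",
        f"hadoop.proxyuser.{user}.hosts": "*",
    }

# For a root-run Ambari Server:
print(proxyuser_props("root"))
# For a hypothetical non-root user 'ambari':
print(proxyuser_props("ambari"))
```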
Assuming your Ambari server principal is ambari-server@REALM.COM (if not, replace 'ambari-server' with your principal's user part):
hadoop.proxyuser.ambari-server.groups=*
hadoop.proxyuser.ambari-server.hosts=*
As the hdfs user, we already ran the first two commands; run the last one (the chmod):
$ hadoop fs -mkdir /user/admin
$ hadoop fs -chown admin:hdfs /user/admin
$ hadoop fs -chmod 755 /user/admin
Created 08-31-2017 09:42 PM
I am running Ambari Server as admin.
I added all these properties in HDFS and tried again, but the issue is still the same.