
Ambari Views fail after Kerberos is enabled

Contributor

Hi,

I am not able to access any Ambari Views except the YARN view since Kerberos was enabled. I don't have any proxy users set up; I just have the Ambari server.

Any suggestions, please? How do I configure the views after Kerberos is enabled?

Hive View:

Issues detected
Service 'ats' check failed: Server Error
Service 'userhome' check failed: Authentication required
Service 'userhome' check failed:
org.apache.hadoop.security.AccessControlException: Authentication required
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:457)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:113)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:738)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:582)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:612)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:608)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getHdfsFileStatus(WebHdfsFileSystem.java:987)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getFileStatus(WebHdfsFileSystem.java:1003)
	at org.apache.ambari.view.utils.hdfs.HdfsApi$3.run(HdfsApi.java:127)
	at org.apache.ambari.view.utils.hdfs.HdfsApi$3.run(HdfsApi.java:125)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)

Trace: Ambari Files View

Authentication required
org.apache.hadoop.security.AccessControlException: Authentication required
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:457)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:113)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:738)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:582)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:612)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
1 ACCEPTED SOLUTION

Master Mentor

@Sam Red

Then you need to set the following proxy-user properties in core-site.xml:

hadoop.proxyuser.root.groups=*
hadoop.proxyuser.root.hosts=* 
hadoop.proxyuser.ambari-server.groups=*
hadoop.proxyuser.ambari-server.hosts=*
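These impersonation settings follow Hadoop's hadoop.proxyuser.&lt;user&gt;.* naming convention in core-site.xml. As a hedged illustration (the helper function is hypothetical, not part of any Hadoop or Ambari API), a small Python sketch that derives the property names for a given user:

```python
def proxyuser_props(user, hosts="*", groups="*"):
    """Build the core-site.xml proxy-user (impersonation) properties
    for a given service user, per Hadoop's
    hadoop.proxyuser.<user>.{groups,hosts} naming convention."""
    return {
        f"hadoop.proxyuser.{user}.groups": groups,
        f"hadoop.proxyuser.{user}.hosts": hosts,
    }

# Properties for Ambari running as root, and for the ambari-server principal:
print(proxyuser_props("root"))
print(proxyuser_props("ambari-server"))
```

In production you would usually restrict hosts and groups to specific values rather than `*`.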


21 REPLIES

Master Mentor

@Sam Red

1. You will need to create an HDFS home directory for the user running the view (in this case, admin), as root:

# su - hdfs
$ hdfs dfs -mkdir /user/admin
$ hdfs dfs -chown admin:hdfs /user/admin

That should resolve the "Service 'userhome' check failed: Authentication required" error.

2. For Kerberos authentication for both the Hive and Files views, create a new view instance and set:

WebHDFS Authentication = auth=KERBEROS;proxyuser=ambari-server-$cluster@REALM

To configure the Hive/Files views, create a new view instance, for example:

Instance Name = Test2

Display Name = Test2

Description = Test Files or Hive

Keep all the other parameters as is and change only the WebHDFS Authentication (see the attached screenshot). The value for proxyuser should be the value obtained in screenshot 1 using the ambari-server filter; the settings for Hive and HDFS are identical.
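The WebHDFS Authentication value above is a semicolon-separated list of key=value pairs. A minimal sketch (the parser function and the example cluster/realm names are hypothetical, for illustration only) of how such a string splits into its parts:

```python
def parse_webhdfs_auth(value):
    """Split a WebHDFS Authentication setting such as
    'auth=KERBEROS;proxyuser=ambari-server-mycluster@EXAMPLE.COM'
    into a dict of its key=value parts."""
    # Split on ';' first, then on the first '=' only, so values
    # containing '@' or further '=' characters survive intact.
    return dict(part.split("=", 1) for part in value.split(";") if part)

cfg = parse_webhdfs_auth("auth=KERBEROS;proxyuser=ambari-server-mycluster@EXAMPLE.COM")
print(cfg["auth"])       # KERBEROS
print(cfg["proxyuser"])  # ambari-server-mycluster@EXAMPLE.COM
```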

Please let me know if that helped


red.png

Contributor

@Geoffrey Shelton Okot Thank you again.

After I applied all your steps, I am still getting the same issue.

34618-hive-view.jpg

Contributor

@Geoffrey Shelton Okot

FilesView :

Local cluster :

org.apache.hadoop.security.AccessControlException: Authentication required
 at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:457)
 at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:113)
 at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:738)
 at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:582)
 at org.apache.hadoop.hdfs.web.W...

34619-filesview.jpg

Master Mentor

@Sam Red

What are the values of the ambari-server filter properties for Hive and HDFS?

HIVE

webhcat.proxyuser.ambari-server-xxxx.groups=*  
webhcat.proxyuser.ambari-server-xxxx.hosts=* 

HDFS

hadoop.proxyuser.hcat.groups=*  
hadoop.proxyuser.hcat.hosts=*  
hadoop.proxyuser.hdfs.groups=*  
hadoop.proxyuser.hdfs.hosts=*

Revert

Contributor
@Geoffrey Shelton Okot

Everything is *

Master Mentor

@Sam Red

What are the versions of your Hive and Files views?

Contributor

@Geoffrey Shelton Okot

Files View : 1.0.0

Hive View : 1.5.0

Master Mentor

@Sam Red

Can you also check these values?

If you are running Ambari Server as the root user, then add the properties below:

hadoop.proxyuser.root.groups=*
hadoop.proxyuser.root.hosts=*

If you are running Ambari Server as a non-root user, then please add the properties below in core-site.xml:

hadoop.proxyuser.<ambari-server-user>.groups=*
hadoop.proxyuser.<ambari-server-user>.hosts=*

Please replace <ambari-server-user> with the user running Ambari Server in the above example.


This assumes your Ambari server principal is ambari-server@REALM.COM; if not, please replace 'ambari-server' with your principal's user part.

hadoop.proxyuser.ambari-server.groups=*
hadoop.proxyuser.ambari-server.hosts=*


We already created the user directory on HDFS for the admin user (the user accessing the Hive view) so it can access the Hive and HDFS views.

As the hdfs user, we already ran the first two commands; run the last one, the chmod:

$ hdfs dfs -mkdir /user/admin
$ hdfs dfs -chown admin:hdfs /user/admin
$ hdfs dfs -chmod 755 /user/admin
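For context, mode 755 on /user/admin gives the owner full access and everyone else read and execute, which is what the views need to traverse into the home directory. A quick Python illustration of how that octal mode renders for a directory:

```python
import stat

# 0o755 on a directory renders as drwxr-xr-x:
# owner rwx, group r-x, other r-x.
print(stat.filemode(stat.S_IFDIR | 0o755))
```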

Revert

Contributor

@Geoffrey Shelton Okot

I am logged in to Ambari Server as the admin user.

I added all these properties in HDFS and restarted, but the issue is still the same.