Support Questions

Find answers, ask questions, and share your expertise

Ambari Views fail after Kerberos is enabled

Hi,

I am not able to open any Ambari views except YARN after enabling Kerberos. I don't have any proxy users set up and just have the Ambari server.

Any suggestions, please?

How do I configure the views after enabling Kerberos?

Hive View :

Issues detected
Service 'ats' check failed: Server Error
Service 'userhome' check failed: Authentication required
Service 'userhome' check failed:
org.apache.hadoop.security.AccessControlException: Authentication required
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:457)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:113)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:738)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:582)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:612)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:608)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getHdfsFileStatus(WebHdfsFileSystem.java:987)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getFileStatus(WebHdfsFileSystem.java:1003)
	at org.apache.ambari.view.utils.hdfs.HdfsApi$3.run(HdfsApi.java:127)
	at org.apache.ambari.view.utils.hdfs.HdfsApi$3.run(HdfsApi.java:125)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)

Trace: Ambari Files View

Authentication required
org.apache.hadoop.security.AccessControlException: Authentication required
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:457)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:113)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:738)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:582)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:612)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
1 ACCEPTED SOLUTION

Mentor

@Sam Red

Then you have to use:

hadoop.proxyuser.root.groups=*
hadoop.proxyuser.root.hosts=* 
hadoop.proxyuser.ambari-server.groups=*
hadoop.proxyuser.ambari-server.hosts=*
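For reference, a sketch of how those same properties would look if added to core-site.xml directly (property names copied from this thread; in Ambari they go under Custom core-site, followed by an HDFS restart):

```xml
<!-- Allow the root and ambari-server users to impersonate any user
     from any host. Wildcard values copied from the thread; consider
     tightening them for production. -->
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.ambari-server.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.ambari-server.hosts</name>
  <value>*</value>
</property>
```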


21 REPLIES

Mentor

@Sam Red

1. You will need to create a user home directory for the user running the view, in this case admin, as root:

# su - hdfs
$ hdfs dfs -mkdir /user/admin
$ hdfs dfs -chown admin:hdfs /user/admin

That should resolve the "'userhome' check failed: Authentication required" error.

2. For Kerberos authentication for both the Hive and Files views, create a new view and set:

WebHDFS Authentication = auth=KERBEROS;proxyuser=ambari-server-$cluster@REALM

To configure the Hive/Files views, create a new view; see the example below:

Instance name = Test2

Display name = Test2

Description = Test Files or Hive

Keep all the other parameters as is and change only the WebHDFS Authentication (see the attached screenshot). The value for proxyuser should be the value obtained in screenshot 1 using the ambari-server filter; the settings for the Hive and HDFS views are identical.

Please let me know if that helped.


red.png

@Geoffrey Shelton Okot Thank you again.

After following all your steps, I am still getting the same issue.

34618-hive-view.jpg

@Geoffrey Shelton Okot

FilesView :

Local cluster :

org.apache.hadoop.security.AccessControlException: Authentication required
 at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:457)
 at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:113)
 at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:738)
 at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:582)
 at org.apache.hadoop.hdfs.web.W...

34619-filesview.jpg

Mentor

@Sam Red

What are the values of the ambari-server filter for Hive and HDFS?

HIVE

webhcat.proxyuser.ambari-server-xxxx.groups=*  
webhcat.proxyuser.ambari-server-xxxx.hosts=* 

HDFS

hadoop.proxyuser.hcat.groups=*  
hadoop.proxyuser.hcat.hosts=*  
hadoop.proxyuser.hdfs.groups=*  
hadoop.proxyuser.hdfs.hosts=*

Revert

@Geoffrey Shelton Okot

Everything is *

Mentor

@Sam Red

What are the versions of your Hive and Files views?

@Geoffrey Shelton Okot

Files View : 1.0.0

Hive View : 1.5.0

Mentor

@Sam Red

Can you also check these values?

If you are running Ambari Server as the root user, then add the properties below:

hadoop.proxyuser.root.groups=*
hadoop.proxyuser.root.hosts=*

If you are running Ambari Server as a non-root user, then please add the properties below in core-site.xml:

hadoop.proxyuser.<ambari-server-user>.groups=*
hadoop.proxyuser.<ambari-server-user>.hosts=*

Please replace <ambari-server-user> with the user running Ambari Server in the example above.


This assumes your Ambari server principal is ambari-server@REALM.COM; if not, please replace 'ambari-server' with your principal's user part.

hadoop.proxyuser.ambari-server.groups=*
hadoop.proxyuser.ambari-server.hosts=*


We already created the user directory on HDFS for the user accessing the Hive view (the admin user) to access the Hive and HDFS views.

As the hdfs user, we already ran the first two commands; the last one is the chmod:

$ hdfs dfs -mkdir /user/admin
$ hdfs dfs -chown admin:hdfs /user/admin
$ hdfs dfs -chmod 755 /user/admin

Revert

@Geoffrey Shelton Okot

I am using admin in the Ambari server.

After I added all these properties in HDFS and ran it again, the issue is still the same.

Mentor

@Sam Red

Here we are talking about the user running the Ambari processes; check like below:

# ls -al /etc/ambari-server/conf/

Regards

@Geoffrey Shelton Okot
total 28
drwxr-xr-x 2 root root  131 Aug 29 11:49 .
drwxr-xr-x 3 root root   18 Aug  1 22:37 ..
-rwxrwxrwx 1 root root 6824 Aug 24 13:02 ambari.properties
-rwxrwxrwx 1 root root  311 Aug 29 11:49 krb5JAASLogin.conf
-rw-r--r-- 1 root root  286 Aug 29 11:49 krb5JAASLogin.conf.bak
-rwxrwxrwx 1 root root 4929 Aug  1 22:37 log4j.properties
-rw-r----- 1 root root    7 Aug  1 22:47 password.dat



Mentor

@Sam Red

Then you have to use:

hadoop.proxyuser.root.groups=*
hadoop.proxyuser.root.hosts=* 
hadoop.proxyuser.ambari-server.groups=*
hadoop.proxyuser.ambari-server.hosts=*

Mentor

@Sam Red

The obscured part above should match the proxyuser in the Authentication part of the Files/Hive view:

hadoop.proxyuser.ambari-server-xxxx.hosts

hadoop.proxyuser.ambari-server-xxxx.groups

In the views section, set:

WebHDFS Authentication : auth=KERBEROS;proxyuser=ambari-server-xxxx@REALM
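As a side note, the proxyuser key in those property names is simply the user part of the principal in the auth string (everything before the '@'). A minimal shell sketch, using the placeholder principal from above, showing which property keys a given auth string implies:

```shell
#!/bin/sh
# Placeholder principal, as written above; substitute your real one.
principal="ambari-server-xxxx@REALM"

# Hadoop keys the hadoop.proxyuser.<user>.* properties on the part
# of the principal before '@'; the realm must match your KDC realm.
user="${principal%%@*}"
realm="${principal##*@}"

echo "hadoop.proxyuser.${user}.hosts=*"
echo "hadoop.proxyuser.${user}.groups=*"
echo "expected realm: ${realm}"
```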

@Geoffrey Shelton Okot

After restarting the Ambari server, I got a new issue:

Service 'hdfs' check failed:
java.lang.NullPointerException
	at org.apache.hadoop.security.authentication.util.KerberosName.getShortName(KerberosName.java:383)
	at org.apache.hadoop.security.User.<init>(User.java:48)
	at org.apache.hadoop.security.User.<init>(User.java:43)
	at org.apache.hadoop.security.UserGroupInformation.createRemoteUser(UserGroupInformation.java:1270)
	at org.apache.hadoop.security.UserGroupInformation.createRemoteUser(UserGroupInformation.java:1254)
	at org.apache.ambari.view.utils.hdfs.HdfsApi.getProxyUser(HdfsApi.java:78)
	at org.apache.ambari.view.utils.hdfs.HdfsApi.<init>(HdfsApi.java:66)
	at org.apache.ambari.view.utils.hdfs.HdfsUtil.connectToHDFSApi(HdfsUtil.java:127)
	at org.apache.ambari.view.commons.hdfs.HdfsService.hdfsSmokeTest(HdfsService.java:136)
	at org.apache.ambari.view.filebrowser.HelpService.hdfsStatus(HelpService.java:86)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)

Mentor

@Sam Red

Two things to do: can you first restart the cluster and see if the issue persists?

What is the value for WebHDFS Authentication in the views ?

From the stack trace, the problem is in the mapping from the full Kerberos principal name to the short user name. This mapping is driven by the following configuration property in core-site.xml:

<property>
  <name>hadoop.security.auth_to_local</name>
  <value></value>
  <description>Maps kerberos principals to local user names</description>
</property>
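One possible cause of the NullPointerException in getShortName is a principal that no auth_to_local rule matches. A hedged sketch of an explicit mapping rule, assuming a principal of the form ambari-server-xxxx@REALM.COM (the pattern, realm, and the local user ambari-server below are placeholders to adapt to your cluster):

```xml
<!-- Map any ambari-server-* principal to the local user ambari-server,
     then fall back to the default rules. -->
<property>
  <name>hadoop.security.auth_to_local</name>
  <value>
    RULE:[1:$1@$0](ambari-server-.*@REALM\.COM)s/.*/ambari-server/
    DEFAULT
  </value>
</property>
```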

Please revert

@Geoffrey Shelton Okot

I don't know what the issue is. I did a lot of research but the issue still persists.

WebHDFS: auth=KERBEROS;proxyuser=ambari-server-abc_bigpipeline@RELAY.COM

Mentor

@Sam Red

I have also been trying to understand what is wrong. What's this command's output?

# klist -kt /etc/security/keytabs/ambari.server.keytab
keytab name: FILE:/etc/security/keytabs/ambari.server.keytab
KVNO           Timestamp           Principal
---- ------------------- ------------------------------------------------------
   1 08/24/2017 15:42:24 ambari-server-abc_bigxxxline@ROMAT.COM
   1 08/24/2017 15:42:24 ambari-server-abc_bigxxxline@ROMAT.COM
   1 08/24/2017 15:42:24 ambari-server-abc_bigxxxline@ROMAT.COM
   1 08/24/2017 15:42:24 ambari-server-abc_bigxxxline@ROMAT.COM
   1 08/24/2017 15:42:24 ambari-server-abc_bigxxxline@ROMAT.COM

Then grab a valid Kerberos ticket

$ kinit -kt /etc/security/keytabs/ambari.server.keytab   ambari-server-abc_bigxxxline@ROMAT.COM

Then try accessing the view again.

@Geoffrey Shelton Okot

Thank you. After a lot of edits I am able to open the Files view, but not the Hive view.

Issues detected
Service 'hdfs' check failed: E090 NullPointerException
Service 'userhome' check failed: HdfsApi connection failed. Check "webhdfs.url" property

Mentor

@Sam Red

The "Service 'userhome' check failed" error is easy to resolve; you need to create the home directory:

# su - hdfs 
$ hdfs dfs -mkdir /user/hive 
$ hdfs dfs -chown hive:hdfs /user/hive

Then retry the hive view.

That should work, let me know.
