"hadoop.security.authentication" to "simple" in the file /etc/hadoop/conf/core-site.xml in ambari.
'hive.server2.authentication' is set to 'None'
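For reference, the relevant entries look roughly like this after the change (hadoop.security.authentication in core-site.xml, and hive.server2.authentication in hive-site.xml):

  <property>
    <name>hadoop.security.authentication</name>
    <value>simple</value>
  </property>

  <property>
    <name>hive.server2.authentication</name>
    <value>NONE</value>
  </property>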
I am able to open the views, but the NodeManagers and DataNodes throw Kerberos authorization errors, as below:
2017-01-05 16:12:22,661 WARN authorize.ServiceAuthorizationManager (ServiceAuthorizationManager.java:authorize(119)) - Authorization failed for yarn (auth:SIMPLE) for protocol=interface org.apache.hadoop.yarn.server.api.ResourceTrackerPB, expected client Kerberos principal is nm/hdpdn1.hadoop.com@HADOOP.COM
2017-01-05 16:12:22,661 INFO ipc.Server (Server.java:authorizeConnection(2039)) - Connection from 192.168.56.41:34702 for protocol org.apache.hadoop.yarn.server.api.ResourceTrackerPB is unauthorized for user yarn (auth:SIMPLE)
2017-01-05 16:12:22,661 INFO ipc.Server (Server.java:doRead(850)) - Socket Reader #1 for port 8025: readAndProcess from client 192.168.56.41 threw exception [org.apache.hadoop.security.authorize.AuthorizationException: User yarn (auth:SIMPLE) is not authorized for protocol interface org.apache.hadoop.yarn.server.api.ResourceTrackerPB, expected client Kerberos principal is nm/hdpdn1.hadoop.com@HADOOP.COM]
Please help me get both the Kerberized cluster and the views working.
You don't need to change the Hadoop properties in core-site.xml. The problem is not with the Hadoop components; it is the Hive/Files view settings that need to be modified so that WebHDFS authentication is set to auth=KERBEROS;proxyuser=<proxyuser>.
Refer to the documentation below on how to configure Ambari views when the cluster is Kerberized.
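As a sketch, the WebHDFS Authentication property in the Files/Hive view instance settings would look something like the line below. The proxy user value here (ambari-server) is only an example; substitute the principal your Ambari server actually runs as:

  WebHDFS Authentication = auth=KERBEROS;proxyuser=ambari-server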