
Kerberos ticket not working after I enabled Solr audits

Super Collaborator

This used to work, but not anymore:

[hive@hadoop1 ~]$ kdestroy
[hive@hadoop1 ~]$
[hive@hadoop1 ~]$ id
uid=1004(hive) gid=501(hadoop) groups=501(hadoop)
[hive@hadoop1 ~]$ klist
klist: No credentials cache found (ticket cache FILE:/tmp/krb5cc_1004)
[hive@hadoop1 ~]$ kinit hive
Password for hive@MY.COM:
[hive@hadoop1 ~]$ klist
Ticket cache: FILE:/tmp/krb5cc_1004
Default principal: hive@MY.COM
Valid starting     Expires            Service principal
12/06/16 17:04:14  12/07/16 17:04:14  krbtgt/MY.COM@MY.COM
        renew until 12/06/16 17:04:14
[hive@hadoop1 ~]$
[hive@hadoop1 ~]$ beeline -u 'jdbc:hive2://hadoop2:10000/default;principal=hive/hadoop2@MY.COM' -f b.sql
Connecting to jdbc:hive2://hadoop2:10000/default;principal=hive/hadoop2@MY.COM
Connected to: Apache Hive (version 1.2.1000.2.5.0.0-1245)
Driver: Hive JDBC (version 1.2.1000.2.5.0.0-1245)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://hadoop2:10000/default> show tables;
Error: Error while compiling statement: FAILED: SemanticException MetaException(message:org.apache.hadoop.security.AccessControlException: Permission denied: user=hive, access=EXECUTE, inode="/apps/hive/warehouse":hdfs:hdfs:d---------
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
        at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:307)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1827)
        at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:108)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3972)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1130)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:851)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2313)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2309)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2307)
) (state=42000,code=40000)
Closing: 0: jdbc:hive2://hadoop2:10000/default;principal=hive/hadoop2@MY.COM
[hive@hadoop1 ~]$


2 REPLIES

Super Collaborator (accepted solution)

Looking at the error Permission denied: user=hive, access=EXECUTE, inode="/apps/hive/warehouse":hdfs:hdfs:d---------, it is HDFS that is performing the authorization check and denying access. Do you have a Ranger policy for the user "hive"? If not, create one granting read, write, and execute on that path, since the HDFS-level permission on the directory is 000.
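
To confirm what the error is reporting, you can inspect the directory's permissions directly. A minimal sketch, assuming you can run HDFS commands as a user allowed to read the path's metadata (for example the hdfs superuser):

[hdfs@hadoop1 ~]$ hdfs dfs -ls -d /apps/hive/warehouse
# a mode of d--------- with owner hdfs:hdfs means the POSIX bits are 000,
# so any access for the hive user has to come from a Ranger HDFS policy
[hdfs@hadoop1 ~]$ hdfs dfs -getfacl /apps/hive/warehouse
# shows any HDFS ACLs that might also apply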

Rising Star

As @Ramesh Mani mentioned, this seems to be authorization related rather than a Kerberos problem. For a quick fix, try assigning read/execute permissions at the HDFS level (hadoop fs -chmod 755 /apps/hive/warehouse).
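
A hedged sketch of that quick fix, assuming you can run commands as the hdfs superuser (on a kerberized cluster you may first need to kinit with the hdfs keytab, whose path and principal vary per cluster):

[hdfs@hadoop1 ~]$ hadoop fs -chmod 755 /apps/hive/warehouse
[hdfs@hadoop1 ~]$ hadoop fs -ls -d /apps/hive/warehouse   # verify the new mode

Note that 755 opens traverse/read to everyone; use a tighter mode (or keep 000 and rely on Ranger policies alone) if that is what your security model requires.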

For a more proper fix, go to Ranger, open your HDFS policies, and make sure the hive user has the required permissions on that directory.
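
If you prefer to script the Ranger change instead of using the Ranger Admin UI, here is a sketch against Ranger's public v2 REST API; the Ranger host/port, the admin credentials, and the HDFS service name (cluster_hadoop) are placeholders you would replace with your own values:

curl -u admin:admin -H "Content-Type: application/json" -X POST \
  http://rangerhost:6080/service/public/v2/api/policy \
  -d '{
    "service": "cluster_hadoop",
    "name": "hive-warehouse-access",
    "isEnabled": true,
    "resources": {
      "path": { "values": ["/apps/hive/warehouse"], "isRecursive": true }
    },
    "policyItems": [
      {
        "users": ["hive"],
        "accesses": [
          { "type": "read",    "isAllowed": true },
          { "type": "write",   "isAllowed": true },
          { "type": "execute", "isAllowed": true }
        ]
      }
    ]
  }'

After the policy is saved, give the NameNode's Ranger plugin a moment to pick it up (it polls for policy updates periodically) and then rerun the beeline query.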