<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question kerberos ticket not working after I enabled SOLR audits in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/kerberos-ticket-not-working-after-I-enabled-SOLR-audits/m-p/140588#M48125</link>
    <description>&lt;P&gt;This used to work; it no longer does.&lt;/P&gt;&lt;PRE&gt;[hive@hadoop1 ~]$ kdestroy
[hive@hadoop1 ~]$
[hive@hadoop1 ~]$ id
uid=1004(hive) gid=501(hadoop) groups=501(hadoop)
[hive@hadoop1 ~]$ klist
klist: No credentials cache found (ticket cache FILE:/tmp/krb5cc_1004)
[hive@hadoop1 ~]$ kinit hive
Password for hive@MY.COM:
[hive@hadoop1 ~]$ klist
Ticket cache: FILE:/tmp/krb5cc_1004
Default principal: hive@MY.COM
Valid starting     Expires            Service principal
12/06/16 17:04:14  12/07/16 17:04:14  krbtgt/MY.COM@MY.COM
        renew until 12/06/16 17:04:14
[hive@hadoop1 ~]$
[hive@hadoop1 ~]$ beeline -u 'jdbc:hive2://hadoop2:10000/default;principal=hive/hadoop2@MY.COM' -f b.sql
Connecting to jdbc:hive2://hadoop2:10000/default;principal=hive/hadoop2@MY.COM
Connected to: Apache Hive (version 1.2.1000.2.5.0.0-1245)
Driver: Hive JDBC (version 1.2.1000.2.5.0.0-1245)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://hadoop2:10000/default&amp;gt; show tables;
Error: Error while compiling statement: FAILED: SemanticException MetaException(message:org.apache.hadoop.security.AccessControlException: Permission denied: user=hive, access=EXECUTE, inode="/apps/hive/warehouse":hdfs:hdfs:d---------
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
        at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:307)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1827)
        at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:108)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3972)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1130)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:851)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2313)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2309)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2307)
) (state=42000,code=40000)
Closing: 0: jdbc:hive2://hadoop2:10000/default;principal=hive/hadoop2@MY.COM
[hive@hadoop1 ~]$

&lt;/PRE&gt;</description>
    <pubDate>Wed, 07 Dec 2016 06:07:34 GMT</pubDate>
    <dc:creator>aliyesami</dc:creator>
    <dc:date>2016-12-07T06:07:34Z</dc:date>
    <item>
      <title>kerberos ticket not working after I enabled SOLR audits</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/kerberos-ticket-not-working-after-I-enabled-SOLR-audits/m-p/140588#M48125</link>
      <description>&lt;P&gt;This used to work; it no longer does.&lt;/P&gt;&lt;PRE&gt;[hive@hadoop1 ~]$ kdestroy
[hive@hadoop1 ~]$
[hive@hadoop1 ~]$ id
uid=1004(hive) gid=501(hadoop) groups=501(hadoop)
[hive@hadoop1 ~]$ klist
klist: No credentials cache found (ticket cache FILE:/tmp/krb5cc_1004)
[hive@hadoop1 ~]$ kinit hive
Password for hive@MY.COM:
[hive@hadoop1 ~]$ klist
Ticket cache: FILE:/tmp/krb5cc_1004
Default principal: hive@MY.COM
Valid starting     Expires            Service principal
12/06/16 17:04:14  12/07/16 17:04:14  krbtgt/MY.COM@MY.COM
        renew until 12/06/16 17:04:14
[hive@hadoop1 ~]$
[hive@hadoop1 ~]$ beeline -u 'jdbc:hive2://hadoop2:10000/default;principal=hive/hadoop2@MY.COM' -f b.sql
Connecting to jdbc:hive2://hadoop2:10000/default;principal=hive/hadoop2@MY.COM
Connected to: Apache Hive (version 1.2.1000.2.5.0.0-1245)
Driver: Hive JDBC (version 1.2.1000.2.5.0.0-1245)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://hadoop2:10000/default&amp;gt; show tables;
Error: Error while compiling statement: FAILED: SemanticException MetaException(message:org.apache.hadoop.security.AccessControlException: Permission denied: user=hive, access=EXECUTE, inode="/apps/hive/warehouse":hdfs:hdfs:d---------
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
        at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:307)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1827)
        at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:108)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3972)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1130)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:851)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2313)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2309)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2307)
) (state=42000,code=40000)
Closing: 0: jdbc:hive2://hadoop2:10000/default;principal=hive/hadoop2@MY.COM
[hive@hadoop1 ~]$

&lt;/PRE&gt;</description>
      <pubDate>Wed, 07 Dec 2016 06:07:34 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/kerberos-ticket-not-working-after-I-enabled-SOLR-audits/m-p/140588#M48125</guid>
      <dc:creator>aliyesami</dc:creator>
      <dc:date>2016-12-07T06:07:34Z</dc:date>
    </item>
    <item>
      <title>Re: kerberos ticket not working after I enabled SOLR audits</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/kerberos-ticket-not-working-after-I-enabled-SOLR-audits/m-p/140589#M48126</link>
      <description>&lt;P&gt;Looking at the error (Permission denied: user=hive, access=EXECUTE, inode="/apps/hive/warehouse":hdfs:hdfs:d---------), it is HDFS that is performing the authorization check and denying access. Do you have a Ranger policy for the user "hive"? If not, create one with read, write, and execute permissions, since your HDFS-level permission on that directory is 000.&lt;/P&gt;</description>
      <pubDate>Wed, 07 Dec 2016 06:50:09 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/kerberos-ticket-not-working-after-I-enabled-SOLR-audits/m-p/140589#M48126</guid>
      <dc:creator>rmani</dc:creator>
      <dc:date>2016-12-07T06:50:09Z</dc:date>
    </item>
    <item>
      <title>Re: kerberos ticket not working after I enabled SOLR audits</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/kerberos-ticket-not-working-after-I-enabled-SOLR-audits/m-p/140590#M48127</link>
      <description>&lt;P&gt;As &lt;A rel="user" href="https://community.cloudera.com/users/218/rmani.html" nodeid="218"&gt;@Ramesh Mani&lt;/A&gt; mentioned, this looks authorization-related rather than a Kerberos problem. As a quick fix, try assigning permissions at the HDFS level (hadoop fs -chmod 755 /apps/hive/warehouse).&lt;/P&gt;&lt;P&gt;For a more proper fix, go to Ranger, open your HDFS policies, and make sure the hive user has the necessary permissions on that directory.&lt;/P&gt;</description>
      <pubDate>Wed, 07 Dec 2016 07:56:25 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/kerberos-ticket-not-working-after-I-enabled-SOLR-audits/m-p/140590#M48127</guid>
      <dc:creator>ed_gleeck</dc:creator>
      <dc:date>2016-12-07T07:56:25Z</dc:date>
    </item>
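The quick fix suggested in the reply above can be sketched as a short shell session. This is a hedged example, not part of the original thread: it assumes you run it on a cluster node as a user with HDFS superuser rights (e.g. hdfs) holding a valid Kerberos ticket, and that the warehouse path matches the one in the error (`/apps/hive/warehouse`); the mode 755 is illustrative and should follow your own security policy.

```shell
# Inspect the current permissions on the warehouse directory.
# The error in the thread shows d--------- (mode 000), owned by hdfs:hdfs.
hadoop fs -ls -d /apps/hive/warehouse

# Quick fix: restore traversable permissions as a privileged user.
# 755 lets the hive user traverse and read; tighten as your policy requires.
hadoop fs -chmod 755 /apps/hive/warehouse

# Verify that the directory is now accessible.
hadoop fs -ls /apps/hive/warehouse
```

As the reply notes, the more durable fix is a Ranger HDFS policy granting the hive user read, write, and execute on the path, so permissions are managed centrally instead of via ad-hoc chmod.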
  </channel>
</rss>

