CDH 5.15 with Kerberos enabled and Sentry privileges set.
Creating a table in Hive from a dataframe in pyspark finishes successfully, but with the following warning:

>>> df = sqlContext.sql("SELECT * FROM test.tab")
>>> df.createOrReplaceTempView("tabView")
>>> sqlContext.sql("CREATE TABLE test.tab2 AS SELECT * FROM tabView")
setfacl: Permission denied. user=xyz is not the owner of inode=.hive-staging_hive_2018-07-12_08-55-57_578_1630889357367494397-1
18/07/01 12:00:00 WARN shims.HadoopShimsSecure: Unable to inherit permissions for file hdfs://nameservice1/user/hive/warehouse/test.db/tab2/part-00000-92d984c8-cc7d-427b-8381-0a9953186260-c000 from file hdfs://nameservice1/user/hive/warehouse/test.db/tab2
Permission denied. user=xyz is not the owner of inode=part-00000-92d984c8-cc7d-427b-8381-0a9953186260-c000
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkOwner(DefaultAuthorizationProvider.java:188)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:174)
    at org.apache.sentry.hdfs.SentryAuthorizationProvider.checkPermission(SentryAuthorizationProvider.java:194)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3877)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3860)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkOwner(FSDirectory.java:3825)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkOwner(FSNamesystem.java:6784)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setAcl(FSNamesystem.java:9296)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.setAcl(NameNodeRpcServer.java:1642)
    at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.setAcl(AuthorizationProviderProxyClientProtocol.java:902)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.setAcl(ClientNamenodeProtocolServerSideTranslatorPB.java:1347)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
DataFrame[]
This may be HDFS-6962 (https://issues.apache.org/jira/browse/HDFS-6962).
Setting dfs.namenode.posix.acl.inheritance.enabled to true in the hdfs-site.xml safety valves (DataNode, NameNode, and Client) of the HDFS service has not resolved the issue.
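For reference, this is the fragment as it would appear in the hdfs-site.xml safety valve. Note that this property was introduced by HDFS-6962 and is only read by the NameNode; whether a given CDH 5.15 build actually honors it depends on whether that fix was backported, so this is a sketch of the attempted configuration, not a confirmed fix:

<!-- NameNode safety valve for hdfs-site.xml -->
<property>
  <name>dfs.namenode.posix.acl.inheritance.enabled</name>
  <value>true</value>
</property>

A NameNode restart (not just a client config redeploy) would be required for the NameNode to pick this up, since permission checks in the trace above happen server-side in FSPermissionChecker.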