Support Questions


[HDFS] ACL: Hive load fails after upgrading HDP 2.4 to 2.5

Rising Star

Hi all,

I'm hitting an issue after upgrading HDP from 2.4 to 2.5. Does anyone have an idea what's happening? Here is the relevant log output:

134746 [main] -5p org.apache.hadoop.hive.ql.exec.FileSinkOperator  - Moving tmp dir: hdfs://hadoop/user/hive/testing/w1_log/.hive-staging_hive_2017-06-07_10-53-45_227_7234037059148387215-1/_tmp.-ext-10000 to: hdfs://hadoop/user/hive/testing/w1_log/.hive-staging_hive_2017-06-07_10-53-45_227_7234037059148387215-1/-ext-10000
134750 [main] -5p org.apache.hadoop.hive.ql.exec.Task  - Loading data to table testing.w1_log partition (year=null, month=null, day=null, hour=null) from hdfs://hadoop/user/hive/testing/w1_log/.hive-staging_hive_2017-06-07_10-53-45_227_7234037059148387215-1/-ext-10000
134776 [main] -5p org.apache.hadoop.hive.ql.exec.MoveTask  - Partition is: {year=null, month=null, day=null, hour=null}
135464 [main] -5p hive.ql.metadata.Hive  - Failed to move: org.apache.hadoop.hdfs.protocol.AclException: Invalid ACL: only directories may have a default ACL.
    at org.apache.hadoop.hdfs.server.namenode.AclStorage.updateINodeAcl(AclStorage.java:282)
    at org.apache.hadoop.hdfs.server.namenode.FSDirAclOp.unprotectedSetAcl(FSDirAclOp.java:208)
    at org.apache.hadoop.hdfs.server.namenode.FSDirAclOp.setAcl(FSDirAclOp.java:146)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setAcl(FSNamesystem.java:8456)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.setAcl(NameNodeRpcServer.java:1968)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.setAcl(ClientNamenodeProtocolServerSideTranslatorPB.java:1338)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1865)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2345)

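The stack trace shows the NameNode rejecting a setAcl call that includes default entries on a plain file. If you want to confirm this is the failing operation independently of Hive, the following sketch should reproduce the same AclException; /tmp/aclcheck is just a hypothetical scratch path:

```shell
# Create an empty test file, then try to set a default ACL entry on it.
# The second command should fail with
# "Invalid ACL: only directories may have a default ACL."
hdfs dfs -touchz /tmp/aclcheck
hdfs dfs -setfacl -m default:user:hive:rwx /tmp/aclcheck
hdfs dfs -rm /tmp/aclcheck   # clean up
```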

[~]$ hdfs dfs -getfacl /user/hive/testing/w1_log
log4j:ERROR Could not connect to remote log4j server at [localhost]. We will try again later.
# file: /user/hive/testing/w1_log
# owner: testing
# group: testing
user::rwx
user:hive:rwx
group::rwx
mask::rwx
other::r-x
default:user::rwx
default:user:hive:rwx
default:group::rwx
default:mask::rwx
default:other::r-x
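For what it's worth, the failure looks consistent with the default: entries above: if permission inheritance is enabled (hive.warehouse.subdir.inherit.perms), Hive copies the parent directory's full ACL, including the default entries, onto the files it moves into the table directory, and the NameNode rejects default ACL entries on plain files. A possible workaround, assuming you don't actually need the default ACL on this directory, is to strip it:

```shell
# Drop only the default ACL entries; the access entries
# (user:hive:rwx etc.) are preserved.
hdfs dfs -setfacl -k /user/hive/testing/w1_log

# Verify that the default:... lines are gone.
hdfs dfs -getfacl /user/hive/testing/w1_log
```

Alternatively, setting hive.warehouse.subdir.inherit.perms=false in hive-site.xml should stop Hive from copying the parent ACL onto moved files, at the cost of losing inherited permissions on new warehouse files.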

2 Replies

Rising Star

Can anybody help me?

Rising Star

Did you check whether the option "dfs.namenode.acls.enabled" is set to "true"?
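For reference, the effective value can be checked from a client node with:

```shell
# Print the value of dfs.namenode.acls.enabled as resolved from the
# client-side configuration (hdfs-site.xml on this node).
hdfs getconf -confKey dfs.namenode.acls.enabled
```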