02-21-2019 07:56 AM
I found the solution to this issue. It can be resolved in either of the following two ways: 1) use setfacl to grant the hive user 'rwx' permission on the directory from which we are loading the data, or 2) give 777 permissions to that directory. Option 1 is safer because only the hive user gets 'rwx' permission; with option 2, all HDFS users get 'rwx' permissions.
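For reference, this is roughly what the two options look like on the command line, using the /user/kkr source directory from this thread as an example (paths and the ACL spec may need adjusting for your environment, and HDFS ACLs must be enabled via dfs.namenode.acls.enabled):

# Option 1: grant only the hive user rwx on the source directory via an HDFS ACL
hdfs dfs -setfacl -m user:hive:rwx /user/kkr
hdfs dfs -getfacl /user/kkr      # verify the new user:hive:rwx entry

# Option 2 (less safe): open the source directory to everyone
hdfs dfs -chmod 777 /user/kkr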
02-19-2019 02:24 PM
Here the problem is not with test_hello.txt; it already has 777 permissions. As I said earlier, "/user/hive/warehouse/*" is owned by hive:hive and the sticky bit is set on that directory, so I cannot change permissions or move data from another HDFS location into the warehouse directory. Please see some of the commands I ran to check the exact permissions and ACLs:

[kkr@myhost ~]$ hdfs dfs -ls /user
Found 1 items
drwxr-xr-x   - kkr kkr          0 2019-02-06 09:58 /user/kkr

[kkr@myhost ~]$ hdfs dfs -ls /user/kkr
Found 3 items
drwx------   - kkr kkr          0 2019-02-06 15:00 /user/kkr/.Trash
drwx------   - kkr kkr          0 2019-01-30 15:20 /user/kkr/.staging
-rwxrwxrwx   3 kkr kkr          7 2019-01-16 11:33 /user/kkr/test_hello.txt

[kkr@myhost ~]$ hdfs dfs -cat /user/kkr/test_hello.txt
Hello

[kkr@myhost ~]$ hdfs dfs -ls hdfs://nameservice1/user/hive/warehouse/test.db/
Found 2 items
drwxrwx--x+  - hive hive          0 2018-08-24 16:53 hdfs://nameservice1/user/hive/warehouse/test.db/test1
drwxrwx--x+  - hive hive          0 2019-02-19 16:26 hdfs://nameservice1/user/hive/warehouse/test.db/testk

[kkr@myhost ~]$ hdfs dfs -getfacl /user/hive
# file: /user/hive
# owner: hive
# group: hive
# flags: --t
user::rwx
group::rwx
other::r-x

///// beeline /////

0: jdbc:hive2://myhost> show databases;
+-----------------+--+
|  database_name  |
+-----------------+--+
| ltc             |
| test            |
+-----------------+--+
2 rows selected (0.181 seconds)

0: jdbc:hive2://myhost> use test;
No rows affected (0.091 seconds)

0: jdbc:hive2://myhost> show tables;
+-----------------+--+
|    tab_name     |
+-----------------+--+
| test1           |
+-----------------+--+
1 rows selected (0.161 seconds)

0: jdbc:hive2://myhost> CREATE TABLE IF NOT EXISTS testK (userid STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n' STORED AS TEXTFILE;
INFO : OK

0: jdbc:hive2://myhost> show tables;
+-----------------+--+
|    tab_name     |
+-----------------+--+
| test1           |
| testk           |
+-----------------+--+
2 rows selected (0.174 seconds)

0: jdbc:hive2://myhost> describe testk;
INFO : OK
+-----------+------------+----------+--+
| col_name  | data_type  | comment  |
+-----------+------------+----------+--+
| userid    | string     |          |
+-----------+------------+----------+--+
1 row selected (0.163 seconds)

0: jdbc:hive2://myhost> LOAD DATA INPATH 'hdfs:/user/kkr/test_hello.txt' OVERWRITE INTO TABLE testK;
...
INFO : Loading data to table test.testk from hdfs://nameservice1/user/kkr/test_hello.txt
ERROR : Failed with exception Access denied: Unable to move source hdfs://nameservice1/user/kkr/test_hello.txt to destination hdfs://nameservice1/user/hive/warehouse/test.db/testk/test_hello.txt: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x
Remote Exception: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x
...
INFO : org.apache.hadoop.hive.ql.metadata.HiveException: Access denied: Unable to move source hdfs://nameservice1/user/kkr/test_hello.txt to destination hdfs://nameservice1/user/hive/warehouse/test.db/testk/test_hello.txt: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x
...
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x
...
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x
...
ERROR : FAILED: Execution Error, return code 20009 from org.apache.hadoop.hive.ql.exec.MoveTask. Access denied: Unable to move source hdfs://nameservice1/user/kkr/test_hello.txt to destination hdfs://nameservice1/user/hive/warehouse/test.db/testk/test_hello.txt: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x
INFO : Completed executing command(queryId=hive_20190219165454_8fc3d09c-e085-45fd-9866-184a0699647c); Time taken: 0.046 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 20009 from org.apache.hadoop.hive.ql.exec.MoveTask. Access denied: Unable to move source hdfs://nameservice1/user/kkr/test_hello.txt to destination hdfs://nameservice1/user/hive/warehouse/test.db/testk/test_hello.txt: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x (state=42000,code=20009)
No rows affected (0.213 seconds)
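Looking at the inode in the error ("/user/kkr" is kkr:kkr drwxr-xr-x), LOAD DATA INPATH is an HDFS move (rename), so the hive user also needs WRITE access on the source directory, not just on the file itself. A quick check one might run from the same kkr shell as above (an illustrative sketch using standard HDFS commands):

hdfs dfs -ls -d /user/kkr      # shows drwxr-xr-x kkr kkr, so hive has no write access here
hdfs dfs -getfacl /user/kkr    # shows no ACL entry granting user:hive any permission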
02-06-2019 09:46 AM
Hi
I have created a sample table in Hive and I am trying to load data into it from an HDFS location, but I am getting an error saying the data cannot be moved to the /user/hive/warehouse/* location. That directory is owned by hive:hive and the sticky bit is set on it, so I cannot change permissions or move data from another HDFS location into the warehouse directory. Can someone help with how to load data manually? (Inserting data row by row works absolutely fine.)
1) CREATE TABLE IF NOT EXISTS testk ( name String);
2) LOAD DATA INPATH '/user/kkr/test_hello.txt' OVERWRITE INTO TABLE testk;
I am seeing the following error(s):
INFO : Compiling command(queryId=hive_20190206123838_18df2dde-e890-455d-ac80-baf15b2f22e3): LOAD DATA INPATH '/user/kkr/test_hello.txt' OVERWRITE INTO TABLE testk
INFO : Semantic Analysis Completed
INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
INFO : Completed compiling command(queryId=hive_20190206123838_18df2dde-e890-455d-ac80-baf15b2f22e3); Time taken: 0.093 seconds
INFO : Executing command(queryId=hive_20190206123838_18df2dde-e890-455d-ac80-baf15b2f22e3): LOAD DATA INPATH '/user/kkr/test_hello.txt' OVERWRITE INTO TABLE testk
INFO : Starting task [Stage-0:MOVE] in serial mode
INFO : Loading data to table ml.testk from hdfs://nameservice1/user/kkr/test_hello.txt
ERROR : Failed with exception Access denied: Unable to move source hdfs://nameservice1/user/kkr/test_hello.txt to destination hdfs://nameservice1/user/hive/warehouse/ml.db/testk/test_hello.txt: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x
Remote Exception: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:165)
	at org.apache.sentry.hdfs.SentryAuthorizationProvider.checkPermission(SentryAuthorizationProvider.java:194)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3887)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6797)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameToInternal(FSNamesystem.java:4048)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameToInt(FSNamesystem.java:4018)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameTo(FSNamesystem.java:3983)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.rename(NameNodeRpcServer.java:816)
	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.rename(AuthorizationProviderProxyClientProtocol.java:271)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.rename(ClientNamenodeProtocolServerSideTranslatorPB.java:590)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
INFO : org.apache.hadoop.hive.ql.metadata.HiveException: Access denied: Unable to move source hdfs://nameservice1/user/kkr/test_hello.txt to destination hdfs://nameservice1/user/hive/warehouse/ml.db/testk/test_hello.txt: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x
	at org.apache.hadoop.hive.ql.metadata.Hive.getHiveException(Hive.java:3048)
	at org.apache.hadoop.hive.ql.metadata.Hive.getHiveException(Hive.java:3006)
	at org.apache.hadoop.hive.ql.metadata.Hive.moveFile(Hive.java:3001)
	at org.apache.hadoop.hive.ql.metadata.Hive.replaceFiles(Hive.java:3302)
	at org.apache.hadoop.hive.ql.metadata.Hive.loadTable(Hive.java:1761)
	at org.apache.hadoop.hive.ql.exec.MoveTask.execute(MoveTask.java:314)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:214)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:99)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2052)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1748)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1501)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1285)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1280)
	at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:236)
	at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:89)
	at org.apache.hive.service.cli.operation.SQLOperation$3$1.run(SQLOperation.java:301)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
	at org.apache.hive.service.cli.operation.SQLOperation$3.run(SQLOperation.java:314)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:165)
	at org.apache.sentry.hdfs.SentryAuthorizationProvider.checkPermission(SentryAuthorizationProvider.java:194)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3887)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6797)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameToInternal(FSNamesystem.java:4048)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameToInt(FSNamesystem.java:4018)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameTo(FSNamesystem.java:3983)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.rename(NameNodeRpcServer.java:816)
	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.rename(AuthorizationProviderProxyClientProtocol.java:271)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.rename(ClientNamenodeProtocolServerSideTranslatorPB.java:590)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
	at org.apache.hadoop.hdfs.DFSClient.rename(DFSClient.java:2028)
	at org.apache.hadoop.hdfs.DistributedFileSystem.rename(DistributedFileSystem.java:622)
	at org.apache.hadoop.hive.ql.metadata.Hive.moveFile(Hive.java:2989)
	... 22 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:165)
	at org.apache.sentry.hdfs.SentryAuthorizationProvider.checkPermission(SentryAuthorizationProvider.java:194)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3887)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6797)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameToInternal(FSNamesystem.java:4048)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameToInt(FSNamesystem.java:4018)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameTo(FSNamesystem.java:3983)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.rename(NameNodeRpcServer.java:816)
	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.rename(AuthorizationProviderProxyClientProtocol.java:271)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.rename(ClientNamenodeProtocolServerSideTranslatorPB.java:590)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
	at org.apache.hadoop.ipc.Client.call(Client.java:1504)
	at org.apache.hadoop.ipc.Client.call(Client.java:1441)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy14.rename(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.rename(ClientNamenodeProtocolTranslatorPB.java:492)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:258)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
	at com.sun.proxy.$Proxy15.rename(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.rename(DFSClient.java:2026)
	... 24 more
ERROR : FAILED: Execution Error, return code 20009 from org.apache.hadoop.hive.ql.exec.MoveTask. Access denied: Unable to move source hdfs://nameservice1/user/kkr/test_hello.txt to destination hdfs://nameservice1/user/hive/warehouse/ml.db/testk/test_hello.txt: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x
INFO : Completed executing command(queryId=hive_20190206123838_18df2dde-e890-455d-ac80-baf15b2f22e3); Time taken: 0.032 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 20009 from org.apache.hadoop.hive.ql.exec.MoveTask. Access denied: Unable to move source hdfs://nameservice1/user/kkr/test_hello.txt to destination hdfs://nameservice1/user/hive/warehouse/ml.db/testk/test_hello.txt: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x (state=42000,code=20009)

Can someone let me know the possible options for loading data from an HDFS location into a Hive table when /user/hive/warehouse does not allow moving or writing data?
Labels:
- Apache Hive
- Apache Sentry
- HDFS