KKR
New Contributor
Posts: 1
Registered: 02-06-2019

Couldn't load data to hive table.


Hi,

I have created a sample table in Hive and am trying to LOAD DATA from an HDFS location, but I am getting an error that the data can't be moved to /user/hive/warehouse/*. That directory is owned by hive:hive and has the sticky bit set, so I can't change its permissions or move data from another HDFS location into the warehouse directory. Can someone help with how to load data manually? (Inserting data row by row works absolutely fine.)
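(For reference, the warehouse directory's owner and mode can be verified with the standard listing command below; no output from this cluster is shown.)

hadoop fs -ls -d /user/hive/warehouse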

 

1) CREATE TABLE IF NOT EXISTS testk ( name String);

2)  LOAD DATA INPATH '/user/kkr/test_hello.txt' OVERWRITE INTO TABLE testk;

 

I am seeing the following errors:

INFO : Compiling command(queryId=hive_20190206123838_18df2dde-e890-455d-ac80-baf15b2f22e3): LOAD DATA INPATH '/user/kkr/test_hello.txt' OVERWRITE INTO TABLE testk
INFO : Semantic Analysis Completed
INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
INFO : Completed compiling command(queryId=hive_20190206123838_18df2dde-e890-455d-ac80-baf15b2f22e3); Time taken: 0.093 seconds
INFO : Executing command(queryId=hive_20190206123838_18df2dde-e890-455d-ac80-baf15b2f22e3): LOAD DATA INPATH '/user/kkr/test_hello.txt' OVERWRITE INTO TABLE testk
INFO : Starting task [Stage-0:MOVE] in serial mode
INFO : Loading data to table ml.testk from hdfs://nameservice1/user/kkr/test_hello.txt
ERROR : Failed with exception Access denied: Unable to move source hdfs://nameservice1/user/kkr/test_hello.txt to destination hdfs://nameservice1/user/hive/warehouse/ml.db/testk/test_hello.txt: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x
Remote Exception: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:165)
at org.apache.sentry.hdfs.SentryAuthorizationProvider.checkPermission(SentryAuthorizationProvider.java:194)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3887)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6797)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameToInternal(FSNamesystem.java:4048)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameToInt(FSNamesystem.java:4018)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameTo(FSNamesystem.java:3983)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.rename(NameNodeRpcServer.java:816)
at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.rename(AuthorizationProviderProxyClientProtocol.java:271)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.rename(ClientNamenodeProtocolServerSideTranslatorPB.java:590)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)

INFO : org.apache.hadoop.hive.ql.metadata.HiveException: Access denied: Unable to move source hdfs://nameservice1/user/kkr/test_hello.txt to destination hdfs://nameservice1/user/hive/warehouse/ml.db/testk/test_hello.txt: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x
at org.apache.hadoop.hive.ql.metadata.Hive.getHiveException(Hive.java:3048)
at org.apache.hadoop.hive.ql.metadata.Hive.getHiveException(Hive.java:3006)
at org.apache.hadoop.hive.ql.metadata.Hive.moveFile(Hive.java:3001)
at org.apache.hadoop.hive.ql.metadata.Hive.replaceFiles(Hive.java:3302)
at org.apache.hadoop.hive.ql.metadata.Hive.loadTable(Hive.java:1761)
at org.apache.hadoop.hive.ql.exec.MoveTask.execute(MoveTask.java:314)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:214)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:99)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2052)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1748)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1501)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1285)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1280)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:236)
at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:89)
at org.apache.hive.service.cli.operation.SQLOperation$3$1.run(SQLOperation.java:301)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
at org.apache.hive.service.cli.operation.SQLOperation$3.run(SQLOperation.java:314)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:165)
at org.apache.sentry.hdfs.SentryAuthorizationProvider.checkPermission(SentryAuthorizationProvider.java:194)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3887)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6797)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameToInternal(FSNamesystem.java:4048)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameToInt(FSNamesystem.java:4018)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameTo(FSNamesystem.java:3983)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.rename(NameNodeRpcServer.java:816)
at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.rename(AuthorizationProviderProxyClientProtocol.java:271)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.rename(ClientNamenodeProtocolServerSideTranslatorPB.java:590)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)

at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
at org.apache.hadoop.hdfs.DFSClient.rename(DFSClient.java:2028)
at org.apache.hadoop.hdfs.DistributedFileSystem.rename(DistributedFileSystem.java:622)
at org.apache.hadoop.hive.ql.metadata.Hive.moveFile(Hive.java:2989)
... 22 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:165)
at org.apache.sentry.hdfs.SentryAuthorizationProvider.checkPermission(SentryAuthorizationProvider.java:194)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3887)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6797)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameToInternal(FSNamesystem.java:4048)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameToInt(FSNamesystem.java:4018)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameTo(FSNamesystem.java:3983)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.rename(NameNodeRpcServer.java:816)
at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.rename(AuthorizationProviderProxyClientProtocol.java:271)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.rename(ClientNamenodeProtocolServerSideTranslatorPB.java:590)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)

at org.apache.hadoop.ipc.Client.call(Client.java:1504)
at org.apache.hadoop.ipc.Client.call(Client.java:1441)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
at com.sun.proxy.$Proxy14.rename(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.rename(ClientNamenodeProtocolTranslatorPB.java:492)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:258)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
at com.sun.proxy.$Proxy15.rename(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.rename(DFSClient.java:2026)
... 24 more

ERROR : FAILED: Execution Error, return code 20009 from org.apache.hadoop.hive.ql.exec.MoveTask. Access denied: Unable to move source hdfs://nameservice1/user/kkr/test_hello.txt to destination hdfs://nameservice1/user/hive/warehouse/ml.db/testk/test_hello.txt: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x
INFO : Completed executing command(queryId=hive_20190206123838_18df2dde-e890-455d-ac80-baf15b2f22e3); Time taken: 0.032 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 20009 from org.apache.hadoop.hive.ql.exec.MoveTask. Access denied: Unable to move source hdfs://nameservice1/user/kkr/test_hello.txt to destination hdfs://nameservice1/user/hive/warehouse/ml.db/testk/test_hello.txt: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x (state=42000,code=20009)

Can someone let me know the possible options for loading data from an HDFS location into a Hive table, given that /user/hive/warehouse does not allow moving or writing data?

New Contributor
Posts: 4
Registered: 11-20-2018

Re: Couldn't load data to hive table.

Is the cluster Kerberized? If yes, your user should be a member of the appropriate group (or hold an equivalent Sentry privilege). Alternatively, you can create the table and load the data as the hdfs superuser with the commands below.

 

# In a shell, switch to the hdfs superuser:
su - hdfs

-- Then, in a Hive session (e.g. beeline):
CREATE TABLE IF NOT EXISTS testk (name STRING);
LOAD DATA INPATH '/user/kkr/test_hello.txt' OVERWRITE INTO TABLE testk;
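If switching users is not an option, a common alternative (a sketch, not from this thread; the testk_data directory name is illustrative) is an external table over the source data, which avoids the move into the warehouse entirely. Note that with Sentry enabled, creating a table with a LOCATION clause may additionally require a URI privilege on that path.

# Stage the file in its own directory (LOCATION must point at a directory, not a file):
hadoop fs -mkdir /user/kkr/testk_data
hadoop fs -mv /user/kkr/test_hello.txt /user/kkr/testk_data/

-- Then, in a Hive session:
CREATE EXTERNAL TABLE IF NOT EXISTS testk_ext (name STRING)
LOCATION '/user/kkr/testk_data';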

 

New Contributor
Posts: 3
Registered: 02-17-2019

Re: Couldn't load data to hive table.

 

ERROR : Failed with exception Access denied: Unable to move source hdfs://nameservice1/user/kkr/test_hello.txt to destination hdfs://nameservice1/user/hive/warehouse/ml.db/testk/test_hello.txt: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x
Remote Exception: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x

Note that the denied WRITE is on the source directory /user/kkr, not on the warehouse: LOAD DATA INPATH moves (renames) the file into the table directory, and the move runs as user hive, which has no write access to /user/kkr (drwxr-xr-x). Use the following command to check the permissions; you may need to grant more access under /user/kkr:

hadoop fs -ls /user/kkr/
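For example (a sketch, assuming HDFS ACLs are enabled via dfs.namenode.acls.enabled=true), you could grant the hive user write access to the source directory:

# grant the hive user write access to /user/kkr, the directory named in the error
hdfs dfs -setfacl -m user:hive:rwx /user/kkr

# verify the resulting ACL
hdfs dfs -getfacl /user/kkr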

 
