<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Error while populating HIVE table with HDFS data in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Error-while-populating-HIVE-table-with-HDFS-data/m-p/171345#M25409</link>
    <description>Forum thread from the Cloudera community: a Hive LOAD DATA INPATH statement fails with an HDFS AccessControlException (Permission denied) when loading a Sqoop-imported file into a Hive table, with a reply explaining the permission fix.</description>
    <pubDate>Mon, 18 Apr 2016 04:12:06 GMT</pubDate>
    <dc:creator>junnnninho</dc:creator>
    <dc:date>2016-04-18T04:12:06Z</dc:date>
    <item>
      <title>Error while populating HIVE table with HDFS data</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Error-while-populating-HIVE-table-with-HDFS-data/m-p/171345#M25409</link>
      <description>&lt;P&gt;Hello there,&lt;/P&gt;&lt;P&gt;I am getting the error below while trying to populate a Hive table from the Hive view with an HDFS file:&lt;/P&gt;&lt;PRE&gt;LOAD DATA INPATH '/employees/part-m-00000' OVERWRITE INTO TABLE temp_employees;&lt;/PRE&gt;&lt;P&gt;The part-m-00000 file is the result of a successful Sqoop import of a MySQL table into HDFS.&lt;/P&gt;&lt;P&gt;I suspect this is a permission issue, but I am not sure where to go to fix it.&lt;/P&gt;&lt;P&gt;INFO : Loading data to table ambari.temp_employees from hdfs://ip-172-31-33-63.sa-east-1.compute.internal:8020/employees/part-m-00000
ERROR : Failed with exception Unable to move source hdfs://ip-172-31-33-63.sa-east-1.compute.internal:8020/employees/part-m-00000 to destination hdfs://ip-172-31-33-63.sa-east-1.compute.internal:8020/apps/hive/warehouse/ambari.db/temp_employees/part-m-00000
org.apache.hadoop.hive.ql.metadata.HiveException: Unable to move source hdfs://ip-172-31-33-63.sa-east-1.compute.internal:8020/employees/part-m-00000 to destination hdfs://ip-172-31-33-63.sa-east-1.compute.internal:8020/apps/hive/warehouse/ambari.db/temp_employees/part-m-00000
at org.apache.hadoop.hive.ql.metadata.Hive.moveFile(Hive.java:2692)
at org.apache.hadoop.hive.ql.metadata.Hive.replaceFiles(Hive.java:2940)
at org.apache.hadoop.hive.ql.metadata.Hive.loadTable(Hive.java:1659)
at org.apache.hadoop.hive.ql.exec.MoveTask.execute(MoveTask.java:298)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1720)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1477)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1254)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1118)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1113)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:154)
at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:71)
at org.apache.hive.service.cli.operation.SQLOperation$1$1.run(SQLOperation.java:206)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:218)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=admin, access=WRITE, inode="/employees/part-m-00000":hdfs:hdfs:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:216)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1771)
at org.apache.hadoop.hdfs.server.namenode.FSDirRenameOp.renameTo(FSDirRenameOp.java:459)
at org.apache.hadoop.hdfs.server.namenode.FSDirRenameOp.renameToInt(FSDirRenameOp.java:73)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameTo(FSNamesystem.java:3661)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.rename(NameNodeRpcServer.java:932)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.rename(ClientNamenodeProtocolServerSideTranslatorPB.java:575)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2151)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2147)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2145)

at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
at org.apache.hadoop.hdfs.DFSClient.rename(DFSClient.java:1961)
at org.apache.hadoop.hdfs.DistributedFileSystem.rename(DistributedFileSystem.java:636)
at org.apache.hadoop.hive.ql.metadata.Hive.moveFile(Hive.java:2684)
... 22 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=admin, access=WRITE, inode="/employees/part-m-00000":hdfs:hdfs:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:216)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1771)
at org.apache.hadoop.hdfs.server.namenode.FSDirRenameOp.renameTo(FSDirRenameOp.java:459)
at org.apache.hadoop.hdfs.server.namenode.FSDirRenameOp.renameToInt(FSDirRenameOp.java:73)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameTo(FSNamesystem.java:3661)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.rename(NameNodeRpcServer.java:932)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.rename(ClientNamenodeProtocolServerSideTranslatorPB.java:575)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2151)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2147)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2145)

at org.apache.hadoop.ipc.Client.call(Client.java:1427)
at org.apache.hadoop.ipc.Client.call(Client.java:1358)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy15.rename(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.rename(ClientNamenodeProtocolTranslatorPB.java:487)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:252)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
at com.sun.proxy.$Proxy16.rename(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.rename(DFSClient.java:1959)&lt;/P&gt;&lt;P&gt;Any help appreciated.&lt;/P&gt;&lt;P&gt;Thanks!&lt;/P&gt;&lt;P&gt;Wellington&lt;/P&gt;
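&lt;P&gt;P.S. In case it helps, the Sqoop command that produced the file was of this general shape (connection details omitted; the uppercase placeholders are illustrative, not my real values):&lt;/P&gt;&lt;PRE&gt;sqoop import --connect jdbc:mysql://MYSQL_HOST/DB_NAME --username DB_USER -P --table employees --target-dir /employees&lt;/PRE&gt;</description>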
      <pubDate>Mon, 18 Apr 2016 04:12:06 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Error-while-populating-HIVE-table-with-HDFS-data/m-p/171345#M25409</guid>
      <dc:creator>junnnninho</dc:creator>
      <dc:date>2016-04-18T04:12:06Z</dc:date>
    </item>
    <item>
      <title>Re: Error while populating HIVE table with HDFS data</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Error-while-populating-HIVE-table-with-HDFS-data/m-p/171346#M25410</link>
      <description>&lt;P&gt;Yes, you have a permission problem on your input file:&lt;/P&gt;&lt;PRE&gt;Permission denied: user=admin, access=WRITE, inode="/employees/part-m-00000":hdfs:hdfs:drwxr-xr-x&lt;/PRE&gt;&lt;P&gt;LOAD DATA INPATH moves (renames) the file into the Hive warehouse, so the user running the query needs write access on the source directory, which here is owned by hdfs:hdfs with mode drwxr-xr-x. As an immediate remedy you can change the permissions, for example:&lt;/P&gt;&lt;PRE&gt;su - hdfs -c "hdfs dfs -chmod -R +w /employees/"&lt;/PRE&gt;&lt;P&gt;Long term, it is best to run all your commands under an end-user account (not hdfs, root, admin, etc.). In Sandbox 2.4 you can use the user "maria_dev": when you run Sqoop, do "su - maria_dev" first and then run your commands, and when you use Ambari views, log in to Ambari as maria_dev as well. That way you avoid permission issues.&lt;/P&gt;&lt;P&gt;Edit: Before doing "su - maria_dev", create the user "maria_dev" on the local OS by running "useradd maria_dev" as root. This is a one-time prep operation.&lt;/P&gt;
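&lt;P&gt;If you would rather not open up the directory with chmod, here is a sketch of the alternative (assuming the paths from the error above; point chown at whichever user actually runs the Hive query):&lt;/P&gt;&lt;PRE&gt;# list the root and the source directory to see current owners and modes
hdfs dfs -ls /
hdfs dfs -ls /employees

# hand the source directory over to the user running the query
# (admin here, per the error message) instead of widening its mode
su - hdfs -c "hdfs dfs -chown -R admin:hdfs /employees"&lt;/PRE&gt;&lt;P&gt;After either fix, re-running the LOAD DATA statement should go through, assuming the warehouse directory itself is writable for that user.&lt;/P&gt;</description>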
      <pubDate>Mon, 18 Apr 2016 11:59:52 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Error-while-populating-HIVE-table-with-HDFS-data/m-p/171346#M25410</guid>
      <dc:creator>pminovic</dc:creator>
      <dc:date>2016-04-18T11:59:52Z</dc:date>
    </item>
  </channel>
</rss>