<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied - While doing Sqoop in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/org-apache-hadoop-ipc-RemoteException-org-apache-hadoop/m-p/283152#M210465</link>
    <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/71064"&gt;@sagittarian&lt;/a&gt;&amp;nbsp;&lt;SPAN&gt;Thanks for letting us know you solved your issue. If you could mark the appropriate reply above as the solution (by clicking the&amp;nbsp;&lt;/SPAN&gt;&lt;CODE&gt;Accept as Solution&lt;/CODE&gt;&lt;SPAN&gt;&amp;nbsp;button) it would help others in a similar situation find it in the future.&lt;/SPAN&gt;&lt;/P&gt;
</description>
    <pubDate>Sat, 16 Nov 2019 03:42:40 GMT</pubDate>
    <dc:creator>ask_bill_brooks</dc:creator>
    <dc:date>2019-11-16T03:42:40Z</dc:date>
    <item>
      <title>org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied - While doing Sqoop</title>
      <link>https://community.cloudera.com/t5/Support-Questions/org-apache-hadoop-ipc-RemoteException-org-apache-hadoop/m-p/282822#M210215</link>
      <description>&lt;P&gt;Hi everyone, I am new to the Hadoop ecosystem. I have installed Cloudera on Ubuntu, and while performing a simple Sqoop import from Postgres to Hive/HDFS I got the following error. I'd appreciate any help:&lt;/P&gt;
&lt;P&gt;Caused by: &lt;STRONG&gt;&lt;U&gt;org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=nthumu, access=WRITE, inode="/user/hive/warehouse/postgres":hdfs:hive:drwxrwx--x&lt;/U&gt;&lt;/STRONG&gt;&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:400)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:256)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:194)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1855)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1839)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1798)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.resolvePathForStartFile(FSDirWriteFileOp.java:323)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2374)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2318)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:771)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:451)&lt;BR /&gt;at 
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)&lt;BR /&gt;at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)&lt;BR /&gt;at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)&lt;BR /&gt;at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869)&lt;BR /&gt;at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:422)&lt;BR /&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)&lt;BR /&gt;at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675)&lt;/P&gt;
&lt;P&gt;at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1499)&lt;BR /&gt;at org.apache.hadoop.ipc.Client.call(Client.java:1445)&lt;BR /&gt;at org.apache.hadoop.ipc.Client.call(Client.java:1355)&lt;BR /&gt;at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)&lt;BR /&gt;at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)&lt;BR /&gt;at com.sun.proxy.$Proxy10.create(Unknown Source)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:349)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)&lt;BR /&gt;at com.sun.proxy.$Proxy11.create(Unknown Source)&lt;BR /&gt;at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:276)&lt;/P&gt;</description>
      <pubDate>Wed, 13 Nov 2019 06:50:57 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/org-apache-hadoop-ipc-RemoteException-org-apache-hadoop/m-p/282822#M210215</guid>
      <dc:creator>sagittarian</dc:creator>
      <dc:date>2019-11-13T06:50:57Z</dc:date>
    </item>
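The denied line in the trace above is worth decoding: inode="/user/hive/warehouse/postgres":hdfs:hive:drwxrwx--x means the directory is owned by user "hdfs" and group "hive" with mode 771, so any user who is neither "hdfs" nor in the "hive" group gets execute permission only, never WRITE. A minimal local illustration of the same POSIX-style mode (this is a sketch on a local filesystem, not a cluster command):

```shell
# Create a directory with the same mode the NameNode reported: drwxrwx--x (771).
mkdir -p /tmp/perm_demo
chmod 771 /tmp/perm_demo

# Owner and group get rwx; everyone else gets only x, so a third user's
# attempt to create files here would be denied, just as HDFS denied "nthumu".
stat -c '%a' /tmp/perm_demo
```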
    <item>
      <title>Re: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied - While doing Sqoop</title>
      <link>https://community.cloudera.com/t5/Support-Questions/org-apache-hadoop-ipc-RemoteException-org-apache-hadoop/m-p/282826#M210219</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/71064"&gt;@sagittarian&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;As the error message shows, this is a permission-related error on the mentioned directory.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Looking at the directory ownership, we find it is &lt;STRONG&gt;&lt;U&gt;"/user/hive/warehouse/postgres":&lt;FONT color="#FF0000"&gt;hdfs:hive&lt;/FONT&gt;:drwxrwx--x&lt;/U&gt;&lt;/STRONG&gt;.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Usually, the "/user/hive" or "/user/hive/warehouse" directory ownership is &lt;FONT color="#008000"&gt;&lt;STRONG&gt;"hive:hdfs"&lt;/STRONG&gt;&lt;/FONT&gt; (where "hdfs" is the superuser group), but in your case it is "hdfs:hive" instead of "hive:hdfs".&lt;BR /&gt;&lt;STRONG&gt;Example (please double-check these directory permissions):&lt;/STRONG&gt;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;# su - hdfs -c "hdfs dfs -ls /user/hive"
# su - hdfs -c "hdfs dfs -ls /user/hive/warehouse"
# su - hdfs -c "hdfs dfs -ls /user/hive/warehouse/postgres"&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Please also verify that the user &lt;STRONG&gt;"nthumu"&lt;/STRONG&gt; belongs to the correct group:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;# id nthumu&lt;/LI-CODE&gt;</description>
      <pubDate>Wed, 13 Nov 2019 05:51:06 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/org-apache-hadoop-ipc-RemoteException-org-apache-hadoop/m-p/282826#M210219</guid>
      <dc:creator>jsensharma</dc:creator>
      <dc:date>2019-11-13T05:51:06Z</dc:date>
    </item>
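The checks in the reply above can be taken one step further. A hedged sketch of a possible fix, assuming superuser shell access on a node with the HDFS client and the paths from this thread; the idea of giving the Sqoop user its own writable home directory is an assumption, not something stated in the thread:

```shell
# Inspect ownership of the warehouse directories (as the HDFS superuser).
su - hdfs -c "hdfs dfs -ls /user/hive"
su - hdfs -c "hdfs dfs -ls /user/hive/warehouse/postgres"

# Confirm which groups the failing user actually belongs to.
id nthumu

# One possible remedy (assumption): create a home directory the Sqoop user
# owns and target that, instead of widening warehouse permissions.
su - hdfs -c "hdfs dfs -mkdir -p /user/nthumu"
su - hdfs -c "hdfs dfs -chown nthumu:nthumu /user/nthumu"
```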
    <item>
      <title>Re: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied - While doing Sqoop</title>
      <link>https://community.cloudera.com/t5/Support-Questions/org-apache-hadoop-ipc-RemoteException-org-apache-hadoop/m-p/283149#M210462</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/50614"&gt;@jsensharma&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks for the quick response. I ran the commands you mentioned and I am getting the message below:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;ls: Permission denied: user=nthumu, access=READ_EXECUTE, inode="/user/hive/warehouse":hdfs:hive:drwxrwx--x&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;I appreciate your help. Thanks!&lt;/P&gt;</description>
      <pubDate>Sat, 16 Nov 2019 01:45:08 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/org-apache-hadoop-ipc-RemoteException-org-apache-hadoop/m-p/283149#M210462</guid>
      <dc:creator>sagittarian</dc:creator>
      <dc:date>2019-11-16T01:45:08Z</dc:date>
    </item>
    <item>
      <title>Re: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied - While doing Sqoop</title>
      <link>https://community.cloudera.com/t5/Support-Questions/org-apache-hadoop-ipc-RemoteException-org-apache-hadoop/m-p/283150#M210463</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/50614"&gt;@jsensharma&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I was able to solve the problem by running the command below:&lt;/P&gt;&lt;P&gt;&lt;CODE&gt;export HADOOP_USER_NAME=hdfs&lt;/CODE&gt;&lt;/P&gt;</description>
      <pubDate>Sat, 16 Nov 2019 03:00:37 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/org-apache-hadoop-ipc-RemoteException-org-apache-hadoop/m-p/283150#M210463</guid>
      <dc:creator>sagittarian</dc:creator>
      <dc:date>2019-11-16T03:00:37Z</dc:date>
    </item>
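The workaround in the post above relies on a documented Hadoop behavior: under simple (non-Kerberos) authentication, the client honors the HADOOP_USER_NAME environment variable, so subsequent commands run as "hdfs", the superuser, and bypass the permission check. A sketch of how it would be used with the thread's Sqoop job; the JDBC URL, table, and target directory are placeholders, and on a Kerberized cluster this variable is ignored:

```shell
# Simple-auth clusters only: the HDFS client trusts HADOOP_USER_NAME.
export HADOOP_USER_NAME=hdfs

# Re-run the import; writes to the warehouse now happen as "hdfs".
sqoop import \
  --connect jdbc:postgresql://dbhost/mydb \
  --table mytable \
  --hive-import
```

Note that impersonating the superuser is a quick unblock, not a hardening-friendly fix; adjusting directory ownership or group membership is the longer-term alternative suggested earlier in the thread.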
    <item>
      <title>Re: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied - While doing Sqoop</title>
      <link>https://community.cloudera.com/t5/Support-Questions/org-apache-hadoop-ipc-RemoteException-org-apache-hadoop/m-p/283152#M210465</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/71064"&gt;@sagittarian&lt;/a&gt;&amp;nbsp;&lt;SPAN&gt;Thanks for letting us know you solved your issue. If you could mark the appropriate reply above as the solution (by clicking the&amp;nbsp;&lt;/SPAN&gt;&lt;CODE&gt;Accept as Solution&lt;/CODE&gt;&lt;SPAN&gt;&amp;nbsp;button) it would help others in a similar situation find it in the future.&lt;/SPAN&gt;&lt;/P&gt;
</description>
      <pubDate>Sat, 16 Nov 2019 03:42:40 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/org-apache-hadoop-ipc-RemoteException-org-apache-hadoop/m-p/283152#M210465</guid>
      <dc:creator>ask_bill_brooks</dc:creator>
      <dc:date>2019-11-16T03:42:40Z</dc:date>
    </item>
  </channel>
</rss>

