<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: SPARK Application + HDFS + User Airflow is not the owner of inode=alapati in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284513#M211264</link>
    <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/59349"&gt;@mike_bronson7&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;That's a classic permissions issue: "airflow" is trying to write to that directory but has no permissions, as it is owned by the user alapati (hence the&lt;STRONG&gt; inode=alapati&lt;/STRONG&gt;).&lt;/P&gt;&lt;P&gt;The easiest solution is to grant the permissions as the hdfs user:&lt;/P&gt;&lt;P&gt;&lt;FONT color="#FF6600"&gt;$ hdfs dfs -chown airflow:{$airflow_group}&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT color="#000000"&gt;Most components, like Spark, Hive, and Sqoop, need to access HDFS.&lt;/FONT&gt;&lt;/P&gt;</description>
    <pubDate>Mon, 02 Dec 2019 07:56:29 GMT</pubDate>
    <dc:creator>Shelton</dc:creator>
    <dc:date>2019-12-02T07:56:29Z</dc:date>
    <item>
      <title>SPARK Application + HDFS + User Airflow is not the owner of inode=alapati</title>
      <link>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284512#M211263</link>
      <description>&lt;P&gt;We are running a Spark application on a Hadoop cluster (HDP version 2.6.5 from Hortonworks).&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;From the logs we can see the following diagnostics:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;User: airflow&lt;BR /&gt;Application Type: SPARK&lt;BR /&gt;User class threw exception: org.apache.hadoop.security.AccessControlException: Permission denied. user=airflow is not the owner of inode=alapati&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;It is not clear what we need to look for in `HDFS` in order to find out why we get Permission denied.&lt;/P&gt;</description>
      <pubDate>Mon, 02 Dec 2019 09:30:27 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284512#M211263</guid>
      <dc:creator>mike_bronson7</dc:creator>
      <dc:date>2019-12-02T09:30:27Z</dc:date>
    </item>
    <item>
      <title>Re: SPARK Application + HDFS + User Airflow is not the owner of inode=alapati</title>
      <link>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284513#M211264</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/59349"&gt;@mike_bronson7&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;That's a classic permissions issue: "airflow" is trying to write to that directory but has no permissions, as it is owned by the user alapati (hence the&lt;STRONG&gt; inode=alapati&lt;/STRONG&gt;).&lt;/P&gt;&lt;P&gt;The easiest solution is to grant the permissions as the hdfs user:&lt;/P&gt;&lt;P&gt;&lt;FONT color="#FF6600"&gt;$ hdfs dfs -chown airflow:{$airflow_group}&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT color="#000000"&gt;Most components, like Spark, Hive, and Sqoop, need to access HDFS.&lt;/FONT&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 02 Dec 2019 07:56:29 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284513#M211264</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2019-12-02T07:56:29Z</dc:date>
    </item>
    <item>
      <title>Re: SPARK Application + HDFS + User Airflow is not the owner of inode=alapati</title>
      <link>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284515#M211266</link>
      <description>&lt;P&gt;How do we find the&amp;nbsp;&lt;SPAN&gt;airflow_group?&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 02 Dec 2019 08:10:15 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284515#M211266</guid>
      <dc:creator>mike_bronson7</dc:creator>
      <dc:date>2019-12-02T08:10:15Z</dc:date>
    </item>
    <item>
      <title>Re: SPARK Application + HDFS + User Airflow is not the owner of inode=alapati</title>
      <link>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284516#M211267</link>
      <description>&lt;P&gt;Dear Shelton,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;do you mean:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;grep airflow /etc/passwd&lt;BR /&gt;airflow:x:1016:1016::/home/airflow:/sbin/nologin&lt;BR /&gt;# id 1016&lt;BR /&gt;uid=1016(airflow) gid=1016(airflow) groups=1016(airflow),1005(hdfs)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;so we need to perform:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;$ hdfs dfs -chown airflow:1016&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;?&lt;/P&gt;</description>
      <pubDate>Mon, 02 Dec 2019 08:17:25 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284516#M211267</guid>
      <dc:creator>mike_bronson7</dc:creator>
      <dc:date>2019-12-02T08:17:25Z</dc:date>
    </item>
    <item>
      <title>Re: SPARK Application + HDFS + User Airflow is not the owner of inode=alapati</title>
      <link>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284521#M211271</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/59349"&gt;@mike_bronson7&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The &lt;STRONG&gt;gid&lt;/STRONG&gt; is a numeric value that just indicates the ID, but the valid groups for airflow are [&lt;STRONG&gt;airflow&lt;/STRONG&gt; and &lt;STRONG&gt;hdfs&lt;/STRONG&gt;].&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;&lt;FONT color="#FF6600"&gt;$ hdfs dfs -chown airflow:hdfs&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;That should do the magic; please revert.&lt;/P&gt;&lt;P&gt;Cheers&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 02 Dec 2019 08:45:34 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284521#M211271</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2019-12-02T08:45:34Z</dc:date>
    </item>
    <item>
      <title>Re: SPARK Application + HDFS + User Airflow is not the owner of inode=alapati</title>
      <link>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284522#M211272</link>
      <description>&lt;P&gt;We got the following, because the path is missing:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;$ hdfs dfs -chown airflow:hdfs&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;-chown: Not enough arguments: expected 2 but got 1&lt;BR /&gt;Usage: hadoop fs [generic options]&lt;/P&gt;</description>
      <pubDate>Mon, 02 Dec 2019 09:01:17 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284522#M211272</guid>
      <dc:creator>mike_bronson7</dc:creator>
      <dc:date>2019-12-02T09:01:17Z</dc:date>
    </item>
    <item>
      <title>Re: SPARK Application + HDFS + User Airflow is not the owner of inode=alapati</title>
      <link>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284523#M211273</link>
      <description>&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;OK, we need to find the paths for the change.&lt;/P&gt;</description>
      <pubDate>Mon, 02 Dec 2019 11:43:50 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284523#M211273</guid>
      <dc:creator>mike_bronson7</dc:creator>
      <dc:date>2019-12-02T11:43:50Z</dc:date>
    </item>
    <item>
      <title>Re: SPARK Application + HDFS + User Airflow is not the owner of inode=alapati</title>
      <link>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284527#M211277</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/59349"&gt;@mike_bronson7&lt;/a&gt;&amp;nbsp;&lt;BR /&gt;That's correct, you need to give the path of the directory &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;, i.e. usually in HDFS:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;&lt;FONT color="#FF6600"&gt;$ hdfs dfs -chown airflow:hdfs /path/in/hdfs/where/you/failed/to/write&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT color="#000000"&gt;As you didn't include the path, I assumed you'd supply it with the &lt;STRONG&gt;-chown&lt;/STRONG&gt; command.&lt;/FONT&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 02 Dec 2019 09:42:31 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284527#M211277</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2019-12-02T09:42:31Z</dc:date>
    </item>
    <item>
      <title>Re: SPARK Application + HDFS + User Airflow is not the owner of inode=alapati</title>
      <link>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284529#M211279</link>
      <description>&lt;P&gt;So, when we do&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;hdfs dfs -ls -R / | grep "airflow " | awk '{print $1" "$2" "$3" "$4" "}'&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;we get:&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;drwxrwx--- - airflow hadoop&lt;BR /&gt;drwxrwx--- - airflow hadoop&lt;BR /&gt;drwxrwx--- - airflow hadoop&lt;BR /&gt;-rw-r----- 3&amp;nbsp; &amp;nbsp;airflow hadoop&lt;BR /&gt;-rw-r----- 3&amp;nbsp; &amp;nbsp;airflow hadoop&lt;BR /&gt;-rw-r----- 3&amp;nbsp; &amp;nbsp;airflow hadoop&lt;BR /&gt;drwxrwx--- - airflow hadoop&lt;BR /&gt;-rw-r----- 3&amp;nbsp; &amp;nbsp;airflow hadoop&lt;BR /&gt;-rw-r----- 3&amp;nbsp; &amp;nbsp;airflow hadoop&lt;BR /&gt;-rw-r----- 3&amp;nbsp; &amp;nbsp;airflow hadoop&lt;/P&gt;&lt;P&gt;.&lt;/P&gt;&lt;P&gt;.&lt;/P&gt;&lt;P&gt;.&lt;/P&gt;&lt;P&gt;.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Do you mean to change every hadoop group to hdfs?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Or, in simple words, how do we know which HDFS path we need to change?&lt;/P&gt;</description>
      <pubDate>Mon, 02 Dec 2019 10:12:22 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284529#M211279</guid>
      <dc:creator>mike_bronson7</dc:creator>
      <dc:date>2019-12-02T10:12:22Z</dc:date>
    </item>
    <item>
      <title>Re: SPARK Application + HDFS + User Airflow is not the owner of inode=alapati</title>
      <link>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284530#M211280</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/59349"&gt;@mike_bronson7&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The hadoop group encapsulates all the users, including hdfs.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;If you run&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;&lt;FONT color="#FF6600"&gt;# cat /etc/group&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;you should see something like&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;hadoop:x:1007:yarn-ats,hive,storm,infra-solr,zookeeper,oozie,atlas,ams,ranger,tez,zeppelin,kms,accumulo,livy,druid,spark,ambari-qa,kafka,hdfs,sqoop,yarn,mapred,hbase,knox&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;So the &lt;STRONG&gt;-chown&lt;/STRONG&gt; should only target the directory shown in &lt;SPAN&gt;the diagnostics logs. &lt;STRONG&gt;NEVER&lt;/STRONG&gt; run the &lt;STRONG&gt;-chown&lt;/STRONG&gt; command on &lt;STRONG&gt;/&lt;/STRONG&gt;, which is the root directory!&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Can you share your log, please?&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 02 Dec 2019 10:25:14 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284530#M211280</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2019-12-02T10:25:14Z</dc:date>
    </item>
    <item>
      <title>Re: SPARK Application + HDFS + User Airflow is not the owner of inode=alapati</title>
      <link>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284531#M211281</link>
      <description>&lt;P&gt;Yes, we get the following:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;cat /etc/group | grep -i hadoop&lt;BR /&gt;hadoop:x:1006:hive,livy,zookeeper,spark,ams,kafka,yarn,hcat,mapred&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;cat /etc/group | grep -i airflow&lt;BR /&gt;hdfs:x:1005:hdfs,hive,airflow&lt;BR /&gt;airflow:x:1016:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;cat /etc/group | grep -i hdfs&lt;BR /&gt;hdfs:x:1005:hdfs,hive,airflow&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Let me know if you need additional info.&lt;/P&gt;</description>
      <pubDate>Mon, 02 Dec 2019 10:49:51 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284531#M211281</guid>
      <dc:creator>mike_bronson7</dc:creator>
      <dc:date>2019-12-02T10:49:51Z</dc:date>
    </item>
    <item>
      <title>Re: SPARK Application + HDFS + User Airflow is not the owner of inode=alapati</title>
      <link>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284578#M211305</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/59349"&gt;@mike_bronson7&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;You can change the ownership of the HDFS directory to &lt;STRONG&gt;airflow:hadoop&lt;/STRONG&gt;, but please do NOT run the &lt;STRONG&gt;-chown&lt;/STRONG&gt; command on &lt;STRONG&gt;/&lt;/STRONG&gt;. The target should be something like &lt;STRONG&gt;/users/airflow/xxx&lt;/STRONG&gt;.&lt;/P&gt;&lt;P&gt;Please let me know.&lt;/P&gt;</description>
      <pubDate>Mon, 02 Dec 2019 22:06:09 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/SPARK-Application-HDFS-User-Airflow-is-not-the-owner-of/m-p/284578#M211305</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2019-12-02T22:06:09Z</dc:date>
    </item>
  </channel>
</rss>

