
SPARK Application + HDFS + User Airflow is not the owner of inode=alapati


We are running a Spark application on a Hadoop cluster (HDP version 2.6.5 from Hortonworks).

 

From the logs we can see the following diagnostics:

 

User: airflow
Application Type: SPARK
User class threw exception: org.apache.hadoop.security.AccessControlException: Permission denied. user=airflow is not the owner of inode=alapati


It is not clear what we need to check in `HDFS` in order to find why we get Permission denied.
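
For reference, one way to see which HDFS object the error refers to is to list the suspected path and compare the owner and group columns with the user running the job. This is only a sketch; the exception does not show the full path of inode=alapati, so the paths below are assumptions:

# list the directory the job writes to and check the owner/group columns (example path)
hdfs dfs -ls /user/alapati

# or search for the inode name under a parent directory
hdfs dfs -ls -R /user | grep alapati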

Michael-Bronson
1 ACCEPTED SOLUTION

Master Mentor

@mike_bronson7 

 

You can change the ownership of the HDFS directory to airflow:hadoop. Please don't run the -chown command on / itself; it should be on something like /users/airflow/xxx.
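
For reference, a sketch of that ownership change; the path below is the placeholder from the reply (not /), and the command is run as the hdfs superuser:

# change owner and group recursively on the specific directory, not on /
sudo -u hdfs hdfs dfs -chown -R airflow:hadoop /users/airflow/xxx

# verify the new owner and group
hdfs dfs -ls /users/airflow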

Please let me know


11 REPLIES


Yes, we get the following:

 

cat /etc/group | grep -i hadoop
hadoop:x:1006:hive,livy,zookeeper,spark,ams,kafka,yarn,hcat,mapred

 

cat /etc/group | grep -i airflow
hdfs:x:1005:hdfs,hive,airflow
airflow:x:1016:

 

cat /etc/group | grep -i hdfs
hdfs:x:1005:hdfs,hive,airflow
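
Note that /etc/group only shows the OS side; HDFS resolves group membership through its own group mapping on the NameNode host, so it can also be checked directly, for example:

# ask HDFS which groups it resolves for the airflow user
hdfs groups airflow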

 

Let me know if you need additional info.

Michael-Bronson
