Created on 12-01-2019 11:45 PM - last edited on 12-02-2019 01:30 AM by VidyaSargur
We are running a Spark application on a Hadoop cluster (HDP version 2.6.5 from Hortonworks).
From the logs we can see the following diagnostics:
User: airflow
Application Type: SPARK
User class threw exception: org.apache.hadoop.security.AccessControlException: Permission denied. user=airflow is not the owner of inode=alapati
It is not clear to us what we need to check in `HDFS` in order to find out why we get Permission denied.
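A minimal sketch of how to inspect ownership and permissions on the HDFS path the job writes to (the exception only names inode=alapati, so /users/airflow below is an assumed path; substitute whatever directory the Spark job actually touches):

hdfs dfs -ls /users                  # list owner and group of each home directory
hdfs dfs -ls -d /users/airflow       # owner, group and mode of the directory itself
hdfs dfs -getfacl /users/airflow     # any extended ACLs on top of the mode bits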
Created 12-02-2019 02:06 PM
You can change the ownership of the HDFS directory to airflow:hadoop. Please run the -chown command on /??? (it should be something like /users/airflow/xxx).
Please let me know
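As a hedged sketch of that suggestion, assuming the directory really is the placeholder /users/airflow/xxx mentioned above (use the real path), the commands run as the hdfs superuser would look roughly like:

sudo -u hdfs hdfs dfs -chown -R airflow:hadoop /users/airflow/xxx   # change owner and group recursively
sudo -u hdfs hdfs dfs -ls -d /users/airflow/xxx                     # confirm the new owner:group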
Created on 12-02-2019 02:48 AM - edited 12-02-2019 02:49 AM
Yes, we get the following:
cat /etc/group | grep -i hadoop
hadoop:x:1006:hive,livy,zookeeper,spark,ams,kafka,yarn,hcat,mapred
cat /etc/group | grep -i airflow
hdfs:x:1005:hdfs,hive,airflow
airflow:x:1016:
cat /etc/group | grep -i hdfs
hdfs:x:1005:hdfs,hive,airflow
Let me know if you need additional info.
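Note that the output above only shows the local /etc/group mappings; HDFS resolves groups through its own group mapping on the NameNode, so it may also be worth checking what HDFS itself reports (a small sketch, using the airflow user from this thread):

hdfs groups airflow                # groups as resolved by the NameNode's group mapping
hdfs dfs -ls -d /users/airflow     # compare with the owner and group on the directory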