When trying to go through the tutorials at http://hortonworks.com/hadoop-tutorial/hello-world... I get a permission-denied error when running LOAD DATA INPATH '/tmp/admin/data/trucks.csv' OVERWRITE INTO TABLE trucks_stage;
This is a brand-new Sandbox 2.3.2 instance in Azure. I did give the trucks.csv file WRITE permission for all three (owner, group, and other). I am logged into Ambari as "admin". The operation appears to be running as "hive", which makes sense since that is the service account.
INFO : Loading data to table default.trucks_stage from hdfs://sandbox.hortonworks.com:8020/tmp/admin/data/trucks.csv
ERROR : Failed with exception Unable to move source hdfs://sandbox.hortonworks.com:8020/tmp/admin/data/trucks.csv to destination hdfs://sandbox.hortonworks.com:8020/apps/hive/warehouse/trucks_stage/trucks.csv
org.apache.hadoop.hive.ql.metadata.HiveException: Unable to move source hdfs://sandbox.hortonworks.com:8020/tmp/admin/data/trucks.csv to destination hdfs://sandbox.hortonworks.com:8020/apps/hive/warehouse/trucks_stage/trucks.csv
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=hive, access=WRITE, inode="/tmp/admin/data/trucks.csv":admin:hdfs:drwxr-xr-x
@Peter Lasne As you can see from the inode permissions (drwxr-xr-x, owner admin), only the admin user has write access, which is what causes this error. Grant the hive user write access on that path and that should fix the issue. On /tmp I would usually recommend rwxrwxrwx.
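For example, one way to open up the permissions (a sketch assuming the sandbox's default superuser account "hdfs" and the /tmp/admin path from the error above):

```shell
# Run as the hdfs superuser so the chmod is allowed regardless of ownership.
# -R applies the change recursively, since LOAD DATA needs write access on
# the parent directory of the file it moves, not just on the file itself.
sudo -u hdfs hdfs dfs -chmod -R 777 /tmp/admin

# Verify the new permissions
hdfs dfs -ls -R /tmp/admin
```

777 is convenient on a throwaway sandbox; on a real cluster you would instead grant access more narrowly (for example via group membership or HDFS ACLs).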
Thanks, granting permission for all users from /tmp down through /data worked fine. I am confused why the tutorial at http://hortonworks.com/hadoop-tutorial/hello-world... says to change the permissions only on the file, when more than that is required.
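The reason the file's own permissions were not enough: LOAD DATA INPATH *moves* the file, and in POSIX-style permission models (which HDFS mimics) a move is an operation on the parent directory, so it needs WRITE on the directory rather than on the file. A minimal local-filesystem illustration of that rule (hypothetical paths, not the actual HDFS layout; note that running as root bypasses permission checks):

```python
import os
import shutil
import tempfile

# Sketch: a world-writable file still cannot be moved out of a
# directory the caller lacks write access on.
base = tempfile.mkdtemp()
src_dir = os.path.join(base, "data")
dst_dir = os.path.join(base, "warehouse")
os.makedirs(src_dir)
os.makedirs(dst_dir)

src = os.path.join(src_dir, "trucks.csv")
with open(src, "w") as f:
    f.write("driverid,truckid\n")

os.chmod(src, 0o777)      # the file itself is rwx for everyone...
os.chmod(src_dir, 0o555)  # ...but its parent directory is read-only

try:
    os.rename(src, os.path.join(dst_dir, "trucks.csv"))
    moved = True
except PermissionError:
    moved = False  # the move fails despite full permissions on the file

os.chmod(src_dir, 0o755)  # restore write so cleanup succeeds
shutil.rmtree(base)
print(moved)  # typically False (unless running as root)
```

This mirrors the Hive error above: trucks.csv was writable, but the hive user had no write access on /tmp/admin/data, so the move into the warehouse failed.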