Created 10-04-2021 07:49 PM
Examples:
[ledapp3@leda-edge-u1-n01 ~]$ yarn logs -applicationId application_1616602302580_19604 > crash.log
WARNING: YARN_OPTS has been replaced by HADOOP_OPTS. Using value of YARN_OPTS.
Permission denied: user=ledapp3, access=READ_EXECUTE, inode="/tmp/logs/hive/logs/application_1616602302580_19604":hive:hadoop:drwxrwx---
[calapr01@callisto-edge-u2-n01 ~]$ yarn logs -applicationId application_1623858219829_386030 | head
WARNING: YARN_OPTS has been replaced by HADOOP_OPTS. Using value of YARN_OPTS.
Permission denied: user=calapr01, access=READ_EXECUTE, inode="/tmp/logs/hive/logs/application_1623858219829_386030":hive:hadoop:drwxrwx---
[gaiapr01@ganymede-edge-u2-n01 ~]$ yarn logs --applicationId application_1631641196769_7918 | head
WARNING: YARN_OPTS has been replaced by HADOOP_OPTS. Using value of YARN_OPTS.
21/10/01 15:23:44 INFO client.ConfiguredRMFailoverProxyProvider: Failing over to rm249
Permission denied: user=gaiapr01, access=EXECUTE, inode="/tmp/logs/hive":hive:hadoop:drwxrwx---
[gaiapr01@io-edge-u1-n01 ~]$ yarn logs --applicationId=application_1626699923211_6165
WARNING: YARN_OPTS has been replaced by HADOOP_OPTS. Using value of YARN_OPTS.
21/10/01 15:27:05 INFO client.ConfiguredRMFailoverProxyProvider: Failing over to rm204
Permission denied: user=gaiapr01, access=EXECUTE, inode="/tmp/logs/hive":hive:hadoop:drwxrwx---
The problem seems to be that these aggregated log files are created by the hive user, so the applicative accounts that launch the jobs cannot access them. We need the applicative accounts to have read permission on those log files; otherwise we cannot figure out why our jobs failed, since we have no access to the logs.
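The two error shapes above can be reproduced with a small model of how HDFS checks a path: the user needs EXECUTE (x) on every ancestor directory and READ_EXECUTE (r-x) on the directory being listed. This is a simplified illustrative sketch, not Hadoop source code; the function names and the 0o770 assumptions for the hive-owned directories are mine.

```python
# Hypothetical sketch of HDFS path permission evaluation, illustrating why
# the "Permission denied" errors above occur for users outside hive:hadoop.

def class_bits(mode: int, owner: str, group: str, user: str, user_groups: set) -> int:
    """Pick the rwx bits (0-7) that apply to `user` for one inode."""
    if user == owner:
        return (mode >> 6) & 0o7
    if group in user_groups:
        return (mode >> 3) & 0o7
    return mode & 0o7

def can_list(path_modes, user, user_groups):
    """Need EXECUTE (x) on every ancestor, READ_EXECUTE (r-x) on the target."""
    *ancestors, target = path_modes
    for name, mode, owner, group in ancestors:
        if not class_bits(mode, owner, group, user, user_groups) & 0o1:
            return f"Permission denied: access=EXECUTE, inode={name!r}"
    name, mode, owner, group = target
    if class_bits(mode, owner, group, user, user_groups) & 0o5 != 0o5:
        return f"Permission denied: access=READ_EXECUTE, inode={name!r}"
    return "OK"

# /tmp and /tmp/logs are world-traversable; /tmp/logs/hive and below are 770.
path = [
    ("/tmp", 0o1777, "hdfs", "hadoop"),
    ("/tmp/logs", 0o777, "yarn", "hadoop"),
    ("/tmp/logs/hive", 0o770, "hive", "hadoop"),
    ("/tmp/logs/hive/logs/application_1616602302580_19604", 0o770, "hive", "hadoop"),
]

print(can_list(path, "ledapp3", {"ledapp3"}))  # denied: EXECUTE on /tmp/logs/hive
print(can_list(path, "hive", {"hadoop"}))      # OK
```

The model shows the check fails on the first ancestor where the applicative account has no x bit, which is why some denials mention /tmp/logs/hive and others mention the application directory itself.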
We tried to check the ACL rights :
[smohanty@io-edge-u1-n01 ~]$ hdfs dfs -getfacl /tmp/logs/hive/logs/
# file: /tmp/logs/hive/logs
# owner: hive
# group: hadoop
user::rwx
group::rwx
group:dsi_moe_bigdata_prod:r-x
mask::rwx
other::---
default:user::rwx
default:group::rwx
default:group:dsi_moe_bigdata_prod:r-x
default:mask::rwx
default:other::---
[smohanty@io-edge-u1-n01 ~]$
[smohanty@callisto-edge-u2-n01 ~]$ hdfs dfs -getfacl /tmp/logs/hive/logs/
# file: /tmp/logs/hive/logs
# owner: hive
# group: hadoop
user::rwx
user:calapr00:rwx
user:calapr01:rwx
group::rwx
group:dsi_moe_dev_bigdata:r-x
mask::rwx
other::---
default:user::rwx
default:group::rwx
default:group:dsi_moe_dev_bigdata:r-x
default:mask::rwx
default:other::---
[smohanty@callisto-edge-u2-n01 ~]$
Is there any ACL modification needed?
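For reference, HDFS evaluates ACLs roughly in the order owner, named users, groups (filtered through the mask), then other. The sketch below is an illustrative assumption of that order, not Hadoop source; it encodes the io-cluster getfacl output above and shows why a member of dsi_moe_bigdata_prod gets r-x while gaiapr01 falls through to "other":

```python
# Simplified model of POSIX-style ACL evaluation as HDFS applies it.
# Named-user and group entries are limited by the mask; "other" is not.

def acl_perms(user, user_groups, acl):
    if user == acl["owner"]:
        return acl["user::"]
    if user in acl["named_users"]:
        return acl["named_users"][user] & acl["mask"]
    group_perms = [p for g, p in acl["groups"].items() if g in user_groups]
    if group_perms:
        bits = 0
        for p in group_perms:       # union of all matching group entries
            bits |= p
        return bits & acl["mask"]
    return acl["other"]

# ACL on /tmp/logs/hive/logs taken from the io-cluster getfacl output above
acl = {
    "owner": "hive",
    "user::": 0o7,                              # user::rwx
    "named_users": {},
    "groups": {"hadoop": 0o7,                   # group::rwx
               "dsi_moe_bigdata_prod": 0o5},    # group:dsi_moe_bigdata_prod:r-x
    "mask": 0o7,                                # mask::rwx
    "other": 0o0,                               # other::---
}

print(oct(acl_perms("smohanty", {"dsi_moe_bigdata_prod"}, acl)))  # 0o5 -> r-x
print(oct(acl_perms("gaiapr01", {"gaiapr01"}, acl)))              # 0o0 -> denied
```

Note also that on callisto, calapr01 has an access entry (user:calapr01:rwx) but no matching default: entry, so application directories created afterwards do not inherit it.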
Created 10-10-2021 10:48 PM
Hello, I applied the solution below to fix it:
hdfs dfs -setfacl -R -m group:calapr01:r-x /tmp/logs/*
hdfs dfs -setfacl -m default:group:calapr01:r-x /tmp/logs/hive/logs
I found this link helpful:
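The two setfacl commands play different roles: "-R -m" rewrites the access ACL of everything that already exists under /tmp/logs, while the "default:" entry is copied onto children created later. A minimal sketch of that inheritance behavior (an illustrative assumption, not HDFS source):

```python
# Why both commands are needed: default ACL entries on a directory become
# the access (and default) ACL entries of directories created inside it.

def create_child(parent_default_acl):
    """New subdirectories inherit the parent's default ACL entries as their
    own access ACL, and keep them as defaults for their own children."""
    return {"access": dict(parent_default_acl), "default": dict(parent_default_acl)}

parent_default = {"group:calapr01": "r-x"}   # set by the second setfacl command
child = create_child(parent_default)
print(child["access"])   # future application_* dirs carry group:calapr01:r-x
```

Without the default entry, newly aggregated application directories would reappear with the old 770-style ACL and the denials would come back.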
Created 10-05-2021 06:06 AM
Hi @Swagat
(1). Can you run the below command in your Linux terminal and re-run the "yarn logs -applicationId <appId>" command:
export HADOOP_USER_NAME=hdfs
(2). Based on the logs, the parent folder "/tmp/logs/hive" and its subfolders have 770 permissions (hive:hadoop), so other users do not have permission to access them. The users ledapp3, calapr01, and gaiapr01 fall under "others", so they cannot access the logs.
Can you run one of the below commands as the HDFS user and ask your end users to try again:
hdfs dfs -chmod 775 <parent folder>
hdfs dfs -chmod -R 775 <parent folder>
hdfs dfs -chmod -R 777 <parent folder> ###Least Recommended.
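A quick check of what each octal mode grants to "other" users such as ledapp3, calapr01, and gaiapr01 (standard octal-to-rwx semantics; the helper names are mine):

```python
# Decode an octal mode into the familiar rwxrwxrwx string to compare
# the current state (770) with the suggested fixes (775, 777).

def rwx(bits):
    return "".join(c if bits & b else "-" for c, b in (("r", 4), ("w", 2), ("x", 1)))

def mode_string(octal):
    return "".join(rwx((octal >> s) & 0o7) for s in (6, 3, 0))

print(mode_string(0o770))  # rwxrwx--- : others have no access (current state)
print(mode_string(0o775))  # rwxrwxr-x : others can list and traverse (read-only)
print(mode_string(0o777))  # rwxrwxrwx : others can also write (least recommended)
```

775 already gives the applicative accounts the r-x they need to read the logs; 777 additionally lets anyone write, which is why it is the least recommended option.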
Similar Issue Reference:
If you are happy with the reply, mark it Accept as Solution.
Created 10-10-2021 10:37 PM
@Swagat, Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.
Regards,
Vidya Sargur