Support Questions

yarn logs of hive on spark requests are not accessible

New Contributor

 

 

Examples :

 

[ledapp3@leda-edge-u1-n01 ~]$ yarn logs -applicationId application_1616602302580_19604 >crash.log

WARNING: YARN_OPTS has been replaced by HADOOP_OPTS. Using value of YARN_OPTS.

Permission denied: user=ledapp3, access=READ_EXECUTE, inode="/tmp/logs/hive/logs/application_1616602302580_19604":hive:hadoop:drwxrwx---

 

[calapr01@callisto-edge-u2-n01 ~]$ yarn logs -applicationId application_1623858219829_386030 | head

WARNING: YARN_OPTS has been replaced by HADOOP_OPTS. Using value of YARN_OPTS.

Permission denied: user=calapr01, access=READ_EXECUTE, inode="/tmp/logs/hive/logs/application_1623858219829_386030":hive:hadoop:drwxrwx---

 

[gaiapr01@ganymede-edge-u2-n01 ~]$ yarn logs --applicationId application_1631641196769_7918 | head

WARNING: YARN_OPTS has been replaced by HADOOP_OPTS. Using value of YARN_OPTS.

21/10/01 15:23:44 INFO client.ConfiguredRMFailoverProxyProvider: Failing over to rm249

Permission denied: user=gaiapr01, access=EXECUTE, inode="/tmp/logs/hive":hive:hadoop:drwxrwx---

 

[gaiapr01@io-edge-u1-n01 ~]$ yarn logs --applicationId=application_1626699923211_6165

WARNING: YARN_OPTS has been replaced by HADOOP_OPTS. Using value of YARN_OPTS.

21/10/01 15:27:05 INFO client.ConfiguredRMFailoverProxyProvider: Failing over to rm204

Permission denied: user=gaiapr01, access=EXECUTE, inode="/tmp/logs/hive":hive:hadoop:drwxrwx---

 

The problem seems to be that the hive user creates these aggregated log files, so the applicative accounts that launch the jobs cannot read them, and neither can we.

We need the applicative accounts to have read permission on those log files; otherwise we cannot determine why our jobs failed, since we have no access to the logs.
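For reference, this is how we inspect the permissions on the aggregated log directory for one of the failing applications (a sketch; the application ID and path are taken from the errors above, and the commands are printed rather than executed since they need a live HDFS client):

```shell
# Sketch: build the inspection commands for one failing application.
# APP_ID and the /tmp/logs/hive/logs layout come from the errors above.
APP_ID="application_1616602302580_19604"
LOG_DIR="/tmp/logs/hive/logs/${APP_ID}"
# Show owner, group, mode bits, and any extended ACLs on the directory:
echo "hdfs dfs -ls -d ${LOG_DIR}"
echo "hdfs dfs -getfacl ${LOG_DIR}"
```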



We checked the ACLs:

[smohanty@io-edge-u1-n01 ~]$ hdfs dfs -getfacl /tmp/logs/hive/logs/
# file: /tmp/logs/hive/logs
# owner: hive
# group: hadoop
user::rwx
group::rwx
group:dsi_moe_bigdata_prod:r-x
mask::rwx
other::---
default:user::rwx
default:group::rwx
default:group:dsi_moe_bigdata_prod:r-x
default:mask::rwx
default:other::---

[smohanty@io-edge-u1-n01 ~]$

[smohanty@callisto-edge-u2-n01 ~]$ hdfs dfs -getfacl /tmp/logs/hive/logs/
# file: /tmp/logs/hive/logs
# owner: hive
# group: hadoop
user::rwx
user:calapr00:rwx
user:calapr01:rwx
group::rwx
group:dsi_moe_dev_bigdata:r-x
mask::rwx
other::---
default:user::rwx
default:group::rwx
default:group:dsi_moe_dev_bigdata:r-x
default:mask::rwx
default:other::---

[smohanty@callisto-edge-u2-n01 ~]$

 

Is there any ACL modification needed?
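We assume something along these lines would be needed (a sketch only; the group name "app_users" is hypothetical, standing in for whatever group the applicative accounts share, and the commands are printed rather than executed since they need a live HDFS client):

```shell
# Sketch of the kind of ACL change we believe is needed: grant the
# submitting accounts' group r-x on the existing tree, plus a matching
# default ACL so future per-application directories inherit it.
GROUP="app_users"               # hypothetical group name
LOG_ROOT="/tmp/logs/hive/logs"
echo "hdfs dfs -setfacl -R -m group:${GROUP}:r-x ${LOG_ROOT}"
echo "hdfs dfs -setfacl -m default:group:${GROUP}:r-x ${LOG_ROOT}"
```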

1 ACCEPTED SOLUTION


New Contributor

Hello, I applied the solution below to fix the issue.

hdfs dfs -setfacl -R -m group:calapr01:r-x /tmp/logs/*

hdfs dfs -setfacl -m default:group:calapr01:r-x /tmp/logs/hive/logs

I found this link helpful:

https://my.cloudera.com/knowledge/ERROR-quotPermission-denied-userltusernamegt-accessREADEXECUTE?id=...

 


3 REPLIES

Rising Star

Hi @Swagat 

 

(1). Can you run the command below in your Linux terminal, then re-run the "yarn logs -applicationId <appID>" command:
export HADOOP_USER_NAME=hdfs

 

(2). Based on the logs, the parent folder "/tmp/logs/hive" and its subfolders have 770 permissions (hive:hadoop), so other users do not have permission to access them. The users ledapp3, calapr01, and gaiapr01 fall under "other", so they cannot access the logs.
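To see why, the mode string from the errors can be decoded with plain shell (a local illustration, no cluster needed; HDFS applies the same owner/group/other model as Unix):

```shell
# With mode drwxrwx---, the "other" class (any user outside owner hive
# and group hadoop) has no permission bits set at all.
mode="drwxrwx---"
other_bits="${mode#d??????}"   # strip the "d" and the owner/group bits,
                               # keeping the last three characters
echo "$other_bits"             # -> ---
```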

 

Can you run one of the commands below as the hdfs user, then ask your end users to try again.
hdfs dfs -chmod 775 <parent folder>
hdfs dfs -chmod -R 775 <parent folder>
hdfs dfs -chmod -R 777 <parent folder> ### Least recommended.
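A local illustration of what the 775 suggestion changes (a sketch on a temporary local directory, not HDFS, since the permission model is the same):

```shell
# Mode 775 gives the "other" class read + traverse (r-x), while write
# access stays limited to the owner and group.
tmpdir=$(mktemp -d)
chmod 775 "$tmpdir"
other=$(ls -ld "$tmpdir" | cut -c8-10)   # permission bits for "other"
echo "$other"                            # -> r-x
rm -rf "$tmpdir"
```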

 

Similar Issue Reference:

https://community.cloudera.com/t5/Support-Questions/Permission-denied-user-root-access-WRITE-inode-q...

https://community.cloudera.com/t5/Support-Questions/Permission-denied-as-I-am-unable-to-delete-a-dir...

 

If you are happy with the reply, mark it as an Accepted Solution.

Community Manager

@Swagat, Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.  


Regards,

Vidya Sargur,
Community Manager


New Contributor

Hello , I applied below solution to do the fix.

hdfs dfs -setfacl -R -m group:calapr01:r-x /tmp/logs/*

hdfs dfs -setfacl -m default:group:calapr01:r-x /tmp/logs/hive/logs

Seems I found this link helpful :

https://my.cloudera.com/knowledge/ERROR-quotPermission-denied-userltusernamegt-accessREADEXECUTE?id=...

 

View solution in original post