
HDP 2.4.2 Spark job logs can't be accessed

Expert Contributor

I've enabled the YARN Ranger plugin in Ambari and configured yarn.admin.acl with "dr.who". But after I submit a job, I can't access its logs. It shows the following:

User [dr.who] is not authorized to view the logs for container_e18_1464594716018_0007_01_000001 in log file [host1_45454]

No logs available for container container_e18_1464594716018_0007_01_000001

Has anyone had the same problem? Thanks in advance.

1 ACCEPTED SOLUTION

Super Guru
@henryon wen

There are multiple solutions for this issue. Can you try setting the properties below?

In yarn-site.xml, if Kerberos is enabled:

<property>
  <description>ACL of who can be admin of the YARN cluster.</description>
  <name>yarn.admin.acl</name>
  <value>dr.who</value>
</property>

Also:

<property>
  <name>yarn.acl.enable</name>
  <value>false</value>
</property>
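As a workaround while the ACLs are being sorted out, aggregated container logs can usually be fetched from the command line with the standard `yarn logs` CLI rather than through the web UI. A sketch, reusing the application and container IDs from this thread (log aggregation must be enabled and the application finished):

```shell
# Fetch all aggregated logs for the application, run as the submitting user.
yarn logs -applicationId application_1464594716018_0007

# Narrow down to the single container named in the error message.
# On Hadoop 2.7.x (HDP 2.4.x) the -nodeAddress <host:port> option may
# also be required alongside -containerId.
yarn logs -applicationId application_1464594716018_0007 \
    -containerId container_e18_1464594716018_0007_01_000001
```

This reads the logs straight from HDFS, so it bypasses the web UI's static-user check entirely.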


3 REPLIES

Guru

Please take a look at this post https://community.hortonworks.com/questions/2349/tip-when-you-get-a-message-in-job-log-user-dr-who.h...

There are different ways to fix this issue, one of which is to put hadoop.http.staticuser.user=yarn in core-site.xml. More details in the linked thread.
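For reference, that workaround is a one-property addition to core-site.xml; the value comes from the linked thread:

```xml
<!-- core-site.xml: the identity assumed for users who hit the Hadoop
     web UIs without authenticating (the default is dr.who) -->
<property>
  <name>hadoop.http.staticuser.user</name>
  <value>yarn</value>
</property>
```

Setting it to yarn makes unauthenticated UI requests appear as the yarn user, which the log server's ACL check accepts.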

Expert Contributor

I tried hadoop.http.staticuser.user=yarn in core-site.xml, but it doesn't work on HDP 2.4.2.

When I submit a job with that setting, it shows logs like the following:

For more detailed output, check application tracking page: http://master02.com:8088/cluster/app/application_1464688798017_0006 Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_e20_1464688798017_0006_02_000001
Exit code: 15
Stack trace: ExitCodeException exitCode=15:
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:576)
    at org.apache.hadoop.util.Shell.run(Shell.java:487)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:753)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:212)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:303)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
