<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: Cannot read log files on applications started by Active Directory users in YARN in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Cannot-read-log-files-on-applications-started-by-Active/m-p/214542#M60164</link>
    <description>&lt;P&gt;I finally got it to work!&lt;/P&gt;&lt;P&gt;The problem was that SSSD was configured to use fully qualified names.&lt;/P&gt;&lt;P&gt;One option was to change hadoop.security.auth_to_local in core-site.xml so that it does not convert myaduser@NET.LOCAL to myaduser. That would make my HDFS user myaduser@NET.LOCAL, and I could then see the logs.&lt;/P&gt;&lt;P&gt;The alternative we went with was to change the following settings in /etc/sssd/sssd.conf.&lt;/P&gt;&lt;P&gt;Remove:&lt;/P&gt;&lt;PRE&gt;default_domain_suffix = net.local&lt;/PRE&gt;&lt;P&gt;Change:&lt;/P&gt;&lt;PRE&gt;use_fully_qualified_names = False&lt;/PRE&gt;&lt;P&gt;Then run on each machine:&lt;/P&gt;&lt;PRE&gt;systemctl restart sssd
sss_cache -E&lt;/PRE&gt;</description>
    <pubDate>Fri, 05 May 2017 19:48:12 GMT</pubDate>
    <dc:creator>sjogren_nils</dc:creator>
    <dc:date>2017-05-05T19:48:12Z</dc:date>
    <item>
      <title>Cannot read log files on applications started by Active Directory users in YARN</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Cannot-read-log-files-on-applications-started-by-Active/m-p/214539#M60161</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I am trying to read logs from jobs started by an AD user, but I get the following error message (both in the web UI and when using yarn logs -applicationId &amp;lt;id&amp;gt;):&lt;/P&gt;&lt;PRE&gt;Exception reading log file. Application submitted by 'myaduser' doesn't own requested log file : directory.info&lt;/PRE&gt;&lt;P&gt;I can, however, read logs from jobs run by system users (hive, zeppelin, etc.).&lt;/P&gt;&lt;P&gt;When I look at the NodeManager logs I see the following error:&lt;/P&gt;&lt;PRE&gt;java.io.IOException: Owner 'myaduser@net.local' for path /var/log/hadoop/yarn/log/application_1493304120821_0017/container_e38_1493304120821_0017_01_000001/directory.info did not match expected owner 'myaduser'
        at org.apache.hadoop.io.SecureIOUtils.checkStat(SecureIOUtils.java:285)
        at org.apache.hadoop.io.SecureIOUtils.forceSecureOpenForRead(SecureIOUtils.java:219)
        at org.apache.hadoop.io.SecureIOUtils.openForRead(SecureIOUtils.java:204)
        at org.apache.hadoop.yarn.server.nodemanager.webapp.ContainerLogsUtils.openLogFileForRead(ContainerLogsUtils.java:170)
        at org.apache.hadoop.yarn.server.nodemanager.webapp.ContainerLogsPage$ContainersLogsBlock.printLogFile(ContainerLogsPage.java:135)
        at org.apache.hadoop.yarn.server.nodemanager.webapp.ContainerLogsPage$ContainersLogsBlock.render(ContainerLogsPage.java:109)
        at org.apache.hadoop.yarn.webapp.view.HtmlBlock.render(HtmlBlock.java:69)
        at org.apache.hadoop.yarn.webapp.view.HtmlBlock.renderPartial(HtmlBlock.java:79)
        at org.apache.hadoop.yarn.webapp.View.render(View.java:235)
        at org.apache.hadoop.yarn.webapp.view.HtmlPage$Page.subView(HtmlPage.java:49)
        at org.apache.hadoop.yarn.webapp.hamlet.HamletImpl$EImp._v(HamletImpl.java:117)
        at org.apache.hadoop.yarn.webapp.hamlet.Hamlet$TD._(Hamlet.java:845)
        at org.apache.hadoop.yarn.webapp.view.TwoColumnLayout.render(TwoColumnLayout.java:56)
        at org.apache.hadoop.yarn.webapp.view.HtmlPage.render(HtmlPage.java:82)
        at org.apache.hadoop.yarn.webapp.Controller.render(Controller.java:212)
        at org.apache.hadoop.yarn.server.nodemanager.webapp.NMController.logs(NMController.java:70)
        at sun.reflect.GeneratedMethodAccessor36.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.yarn.webapp.Dispatcher.service(Dispatcher.java:162)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
        at com.google.inject.servlet.ServletDefinition.doService(ServletDefinition.java:263)
        at com.google.inject.servlet.ServletDefinition.service(ServletDefinition.java:178)
        at com.google.inject.servlet.ManagedServletPipeline.service(ManagedServletPipeline.java:91)
        at com.google.inject.servlet.FilterChainInvocation.doFilter(FilterChainInvocation.java:62)
        at com.sun.jersey.spi.container.servlet.ServletContainer.doFilter(ServletContainer.java:900)
        at com.sun.jersey.spi.container.servlet.ServletContainer.doFilter(ServletContainer.java:834)
        at org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebAppFilter.doFilter(NMWebAppFilter.java:72)
        at com.sun.jersey.spi.container.servlet.ServletContainer.doFilter(ServletContainer.java:795)
        at com.google.inject.servlet.FilterDefinition.doFilter(FilterDefinition.java:163)
        at com.google.inject.servlet.FilterChainInvocation.doFilter(FilterChainInvocation.java:58)
        at com.google.inject.servlet.ManagedFilterPipeline.dispatch(ManagedFilterPipeline.java:118)
        at com.google.inject.servlet.GuiceFilter.doFilter(GuiceFilter.java:113)
        at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212)
        at org.apache.hadoop.security.http.XFrameOptionsFilter.doFilter(XFrameOptionsFilter.java:57)
        at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:617)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:576)
        at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212)
        at org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter.doFilter(StaticUserWebFilter.java:109)
        at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212)
        at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1409)
        at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212)
        at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
        at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212)
        at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:399)
        at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
        at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
        at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
        at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
        at org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
        at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
        at org.mortbay.jetty.Server.handle(Server.java:326)
        at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
        at org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
        at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
        at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
        at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
        at org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
        at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
&lt;/PRE&gt;&lt;P&gt;This occurs both when I run Hive queries and when I run Spark through spark-submit and through Zeppelin with Livy (impersonation).&lt;/P&gt;&lt;P&gt;/var/log/hadoop/yarn/log/application_1493304120821_0017/container_e38_1493304120821_0017_01_000001/directory.info is owned by myaduser@net.local:hadoop; myaduser@net.local is not in the group hadoop, however.&lt;/P&gt;&lt;P&gt;In all services we use only the username, without the realm. For example, the hadoop user is just "myaduser" (not myaduser@NET.LOCAL), and the home folder is /user/myaduser/.&lt;/P&gt;</description>
      <pubDate>Fri, 28 Apr 2017 16:27:14 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Cannot-read-log-files-on-applications-started-by-Active/m-p/214539#M60161</guid>
      <dc:creator>sjogren_nils</dc:creator>
      <dc:date>2017-04-28T16:27:14Z</dc:date>
    </item>
    <item>
      <title>Re: Cannot read log files on applications started by Active Directory users in YARN</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Cannot-read-log-files-on-applications-started-by-Active/m-p/214540#M60162</link>
      <description>&lt;P&gt;Can you check the mapping rules?&lt;/P&gt;&lt;PRE&gt;$ hadoop org.apache.hadoop.security.HadoopKerberosName &amp;lt;username&amp;gt;&lt;/PRE&gt;</description>
      <pubDate>Thu, 04 May 2017 00:30:29 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Cannot-read-log-files-on-applications-started-by-Active/m-p/214540#M60162</guid>
      <dc:creator>rjain1</dc:creator>
      <dc:date>2017-05-04T00:30:29Z</dc:date>
    </item>
    <item>
      <title>Re: Cannot read log files on applications started by Active Directory users in YARN</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Cannot-read-log-files-on-applications-started-by-Active/m-p/214541#M60163</link>
      <description>&lt;PRE&gt;$ hadoop org.apache.hadoop.security.HadoopKerberosName myaduser
Name: myaduser to myaduser&lt;/PRE&gt;&lt;P&gt;
&lt;/P&gt;&lt;PRE&gt;$ hadoop org.apache.hadoop.security.HadoopKerberosName myaduser@NET.LOCAL
Name: myaduser@NET.LOCAL to myaduser&lt;/PRE&gt;</description>
      <pubDate>Thu, 04 May 2017 16:35:09 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Cannot-read-log-files-on-applications-started-by-Active/m-p/214541#M60163</guid>
      <dc:creator>sjogren_nils</dc:creator>
      <dc:date>2017-05-04T16:35:09Z</dc:date>
    </item>
    <item>
      <title>Re: Cannot read log files on applications started by Active Directory users in YARN</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Cannot-read-log-files-on-applications-started-by-Active/m-p/214542#M60164</link>
      <description>&lt;P&gt;I finally got it to work!&lt;/P&gt;&lt;P&gt;The problem was that SSSD was configured to use fully qualified names.&lt;/P&gt;&lt;P&gt;One option was to change hadoop.security.auth_to_local in core-site.xml so that it does not convert myaduser@NET.LOCAL to myaduser. That would make my HDFS user myaduser@NET.LOCAL, and I could then see the logs.&lt;/P&gt;&lt;P&gt;The alternative we went with was to change the following settings in /etc/sssd/sssd.conf.&lt;/P&gt;&lt;P&gt;Remove:&lt;/P&gt;&lt;PRE&gt;default_domain_suffix = net.local&lt;/PRE&gt;&lt;P&gt;Change:&lt;/P&gt;&lt;PRE&gt;use_fully_qualified_names = False&lt;/PRE&gt;&lt;P&gt;Then run on each machine:&lt;/P&gt;&lt;PRE&gt;systemctl restart sssd
sss_cache -E&lt;/PRE&gt;</description>
      <pubDate>Fri, 05 May 2017 19:48:12 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Cannot-read-log-files-on-applications-started-by-Active/m-p/214542#M60164</guid>
      <dc:creator>sjogren_nils</dc:creator>
      <dc:date>2017-05-05T19:48:12Z</dc:date>
    </item>
    <item>
      <title>Re: Cannot read log files on applications started by Active Directory users in YARN</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Cannot-read-log-files-on-applications-started-by-Active/m-p/283476#M60165</link>
      <description>&lt;P&gt;Rightly said. The above SSSD config change will help here.&lt;/P&gt;&lt;P&gt;Along with the SSSD change and restart, don't forget to restart the involved Hadoop daemons (NodeManager, etc.). This is needed to rebuild the in-memory cache that holds the UID -&amp;gt; username mapping for up to 4 hours without invalidation [1].&lt;/P&gt;&lt;PRE&gt;[1]: Ref: org.apache.hadoop.fs.CommonConfigurationKeys
----
  public static final String HADOOP_SECURITY_UID_NAME_CACHE_TIMEOUT_KEY =
    "hadoop.security.uid.cache.secs";

  public static final long HADOOP_SECURITY_UID_NAME_CACHE_TIMEOUT_DEFAULT =
    4*60*60; // 4 hours
----&lt;/PRE&gt;</description>
      <pubDate>Wed, 20 Nov 2019 12:25:21 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Cannot-read-log-files-on-applications-started-by-Active/m-p/283476#M60165</guid>
      <dc:creator>Lingesh</dc:creator>
      <dc:date>2019-11-20T12:25:21Z</dc:date>
    </item>
  </channel>
</rss>

