<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Spark logs location in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Spark-logs-location/m-p/126394#M89125</link>
    <description>&lt;P&gt;I looked at "yarn.nodemanager.log-dirs" in YARN but it seems YARN will clear all the logs immediately after completion of the job.&lt;/P&gt;</description>
    <pubDate>Wed, 13 Jul 2016 13:35:22 GMT</pubDate>
    <dc:creator>divakarreddy_a</dc:creator>
    <dc:date>2016-07-13T13:35:22Z</dc:date>
    <item>
      <title>Spark logs location</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-logs-location/m-p/126391#M89122</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P style="margin-left: 20px;"&gt;We are running Spark jobs and know that YARN creates logs on HDFS at /app-logs/&amp;lt;running User&amp;gt;/logs/application_1463538185607_99971.&lt;/P&gt;&lt;P style="margin-left: 20px;"&gt;To see more details about the logs, we can run yarn logs -applicationId application_1463538185607_99971.&lt;/P&gt;&lt;P style="margin-left: 20px;"&gt;But we are working on a Spark automation process and are trying to keep the logs in a custom location. In order to achieve this, we added the "log4j.appender.rolling.file" property in the "Custom spark-log4j-properties" section through Ambari:&lt;/P&gt;&lt;P&gt;    log4j.appender.rolling.file=${spark.yarn.app.container.log.dir}/spark.log&lt;/P&gt;&lt;P&gt;Here I'm not sure where Spark is going to create logs for successful/failed jobs.&lt;/P&gt;&lt;P&gt;Can you suggest where we can check these Spark logs?&lt;/P&gt;</description>
      <pubDate>Wed, 13 Jul 2016 12:42:03 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-logs-location/m-p/126391#M89122</guid>
      <dc:creator>divakarreddy_a</dc:creator>
      <dc:date>2016-07-13T12:42:03Z</dc:date>
    </item>
    <item>
      <title>Re: Spark logs location</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-logs-location/m-p/126392#M89123</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/2348/divakarreddya.html" nodeid="2348"&gt;@Divakar Annapureddy&lt;/A&gt;&lt;/P&gt;&lt;P&gt;I doubt log4j will work with HDFS.&lt;/P&gt;&lt;P&gt;Try setting the file location to a native Linux path, something like /var/log/spark/spark.log.&lt;/P&gt;</description>
      <pubDate>Wed, 13 Jul 2016 13:26:53 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-logs-location/m-p/126392#M89123</guid>
      <dc:creator>rpathak</dc:creator>
      <dc:date>2016-07-13T13:26:53Z</dc:date>
    </item>
    <item>
      <title>Re: Spark logs location</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-logs-location/m-p/126393#M89124</link>
      <description>&lt;P&gt;Thanks for this. I tried it earlier, but it's not creating any logs there; I'm seeing only .out files.&lt;/P&gt;</description>
      <pubDate>Wed, 13 Jul 2016 13:30:00 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-logs-location/m-p/126393#M89124</guid>
      <dc:creator>divakarreddy_a</dc:creator>
      <dc:date>2016-07-13T13:30:00Z</dc:date>
    </item>
    <item>
      <title>Re: Spark logs location</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-logs-location/m-p/126394#M89125</link>
      <description>&lt;P&gt;I looked at "yarn.nodemanager.log-dirs" in YARN but it seems YARN will clear all the logs immediately after completion of the job.&lt;/P&gt;</description>
      <pubDate>Wed, 13 Jul 2016 13:35:22 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-logs-location/m-p/126394#M89125</guid>
      <dc:creator>divakarreddy_a</dc:creator>
      <dc:date>2016-07-13T13:35:22Z</dc:date>
    </item>
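The reply above touches on why nothing survives under "yarn.nodemanager.log-dirs": YARN removes local container logs once an application finishes (aggregating them to HDFS when log aggregation is enabled). A sketch of the yarn-site.xml settings involved — the property names are standard YARN configuration, but the values here are illustrative, not the poster's actual settings:

```xml
<!-- Aggregate container logs to HDFS after the application completes -->
<property>
  <name>yarn.log-aggregation-enable</name>
  <value>true</value>
</property>
<!-- Illustrative value: keep local container logs for 10 minutes after
     completion, to allow inspection before deletion -->
<property>
  <name>yarn.nodemanager.delete.debug-delay-sec</name>
  <value>600</value>
</property>
```

With aggregation enabled, the aggregated logs remain retrievable via yarn logs -applicationId even after the local copies are cleared.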
    <item>
      <title>Re: Spark logs location</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-logs-location/m-p/126395#M89126</link>
      <description>&lt;P&gt;I added the below properties in the advanced log4j properties, and Spark is creating logs in the local directory.&lt;/P&gt;&lt;P&gt;log4j.rootLogger=INFO, rolling&lt;/P&gt;&lt;P&gt;log4j.appender.rolling=org.apache.log4j.RollingFileAppender&lt;/P&gt;&lt;P&gt;log4j.appender.rolling.file=/var/log/spark/spark.log&lt;/P&gt;&lt;P&gt;log4j.appender.rolling.encoding=UTF-8&lt;/P&gt;&lt;P&gt;log4j.appender.rolling.layout=org.apache.log4j.PatternLayout&lt;/P&gt;&lt;P&gt;log4j.appender.rolling.layout.conversionPattern=[%d] %p %m (%c)%n&lt;/P&gt;&lt;P&gt;log4j.appender.rolling.maxBackupIndex=5&lt;/P&gt;&lt;P&gt;log4j.appender.rolling.maxFileSize=50MB&lt;/P&gt;&lt;P&gt;log4j.logger.org.apache.spark=WARN&lt;/P&gt;&lt;P&gt;log4j.logger.org.eclipse.jetty=WARN&lt;/P&gt;&lt;P&gt;#log4j.appender.rolling.file=${spark.yarn.app.container.log.dir}/spark.log&lt;/P&gt;&lt;P&gt;Setting log4j.appender.rolling.file to ${spark.yarn.app.container.log.dir}/spark.log doesn't work for me to write logs in HDFS.&lt;/P&gt;</description>
      <pubDate>Wed, 13 Jul 2016 16:01:28 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-logs-location/m-p/126395#M89126</guid>
      <dc:creator>divakarreddy_a</dc:creator>
      <dc:date>2016-07-13T16:01:28Z</dc:date>
    </item>
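For readers following along, the working configuration from the final post can be consolidated into one fragment (a sketch of the thread's outcome, intended for the "Custom spark-log4j-properties" section in Ambari; note that ${spark.yarn.app.container.log.dir} only resolves inside a YARN container, and per the reply above log4j does not write to HDFS, which is why the poster settled on a fixed local path):

```properties
# Root logger writes INFO and above to the rolling file appender
log4j.rootLogger=INFO, rolling

# Rolling file appender on the local filesystem
log4j.appender.rolling=org.apache.log4j.RollingFileAppender
log4j.appender.rolling.file=/var/log/spark/spark.log
log4j.appender.rolling.encoding=UTF-8
log4j.appender.rolling.maxFileSize=50MB
log4j.appender.rolling.maxBackupIndex=5
log4j.appender.rolling.layout=org.apache.log4j.PatternLayout
log4j.appender.rolling.layout.conversionPattern=[%d] %p %m (%c)%n

# Quieten noisy frameworks
log4j.logger.org.apache.spark=WARN
log4j.logger.org.eclipse.jetty=WARN
```

The file path must exist and be writable by the user running the job on every node where executors run.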
  </channel>
</rss>

