<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Spark job submit log messages on console in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-submit-log-messages-on-console/m-p/163046#M125420</link>
    <description>&lt;P&gt;Hi Simmon,&lt;/P&gt;&lt;P&gt;Thanks for the tip. I have an additional question: I assume this configuration works in "yarn-cluster" mode (where the driver and the executors both run under YARN's control on the cluster nodes)?&lt;/P&gt;&lt;P&gt;My problem is that I perform my spark-submit in "yarn-client" mode, which means my driver is not managed by YARN; as a consequence, the driver application's logs go to the console of the server where I ran the "spark-submit" command. Since this is a long-running job (several days), I would like to redirect the driver's logs to a dedicated file via log4j configuration, but I couldn't get that to work. Any idea how to achieve this?&lt;/P&gt;&lt;P&gt;Thanks again&lt;/P&gt;</description>
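    <!-- Editor's note: one way to approach the question above. In yarn-client mode the driver JVM runs on the submitting host, so its log4j configuration can be overridden there with spark-submit's documented --driver-java-options flag. A minimal sketch follows; the file paths, appender name, and jar name are assumptions, not taken from the thread.

    ```shell
    # Hypothetical log4j.properties routing the driver's root logger to a file.
    # In yarn-client mode the driver runs on the host that called spark-submit,
    # so this file lives on that host; executor logging is unaffected.
    cat - > /tmp/driver-log4j.properties <<'EOF'
    log4j.rootCategory=INFO, file
    log4j.appender.file=org.apache.log4j.FileAppender
    log4j.appender.file.File=/var/log/spark/driver.log
    log4j.appender.file.layout=org.apache.log4j.PatternLayout
    log4j.appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
    EOF

    # Point only the driver JVM at the custom config (log4j 1.x system property).
    spark-submit \
      --master yarn-client \
      --driver-java-options "-Dlog4j.configuration=file:/tmp/driver-log4j.properties" \
      my_long_running_job.jar
    ```

    The same property can alternatively be set via spark.driver.extraJavaOptions in spark-defaults.conf if the flag should apply to every submission from that host. -->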
    <pubDate>Mon, 20 Feb 2017 18:41:42 GMT</pubDate>
    <dc:creator>schausson</dc:creator>
    <dc:date>2017-02-20T18:41:42Z</dc:date>
  </channel>
</rss>

