<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Spark job submit log messages on console in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-submit-log-messages-on-console/m-p/163047#M125421</link>
    <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/10930/scarroll.html" nodeid="10930"&gt;@Sebastian Carroll&lt;/A&gt; These options work in both yarn-client and yarn-cluster mode. You will need to ensure that the log4j configuration includes an appropriate file appender.&lt;/P&gt;&lt;P&gt;For example, a minimal log4j.properties sketch (the appender name, file path, and size limits here are illustrative, not required values):&lt;/P&gt;&lt;PRE&gt;log4j.rootCategory=INFO, console, file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/var/log/spark/spark-driver.log
log4j.appender.file.MaxFileSize=10MB
log4j.appender.file.MaxBackupIndex=5
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n&lt;/PRE&gt;&lt;P&gt;That said, if you have a job that runs for multiple days, you are far better off using yarn-cluster mode, so the driver runs safely on the cluster rather than depending on a single client node staying up and connected for the job's entire duration.&lt;/P&gt;</description>
    <pubDate>Mon, 20 Feb 2017 18:49:10 GMT</pubDate>
    <dc:creator>sball</dc:creator>
    <dc:date>2017-02-20T18:49:10Z</dc:date>
  </channel>
</rss>