<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Config log4j in Spark - Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Config-log4j-in-Spark/m-p/34968#M11673</link>
    <description>&lt;P&gt;Question from the Archives of Support Questions (Read Only): configuring a custom log4j.properties for a Spark 1.3.0 application on Cloudera 5.4.8, passed to spark-submit via the --files flag.&lt;/P&gt;</description>
    <pubDate>Fri, 25 Oct 2019 16:26:18 GMT</pubDate>
    <dc:creator>guillermoOrtiz</dc:creator>
    <dc:date>2019-10-25T16:26:18Z</dc:date>
    <item>
      <title>Config log4j in Spark</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Config-log4j-in-Spark/m-p/34968#M11673</link>
      <description>&lt;P&gt;I have read the other threads about this topic, but I can't get it to work.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I'm using Cloudera 5.4.8 with Spark 1.3.0 and have created a log4j.properties:&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;log4j.rootCategory=DEBUG, RollingAppender, myConsoleAppender&lt;BR /&gt;log4j.logger.example.spark=debug&lt;/P&gt;
&lt;P&gt;log4j.appender.myConsoleAppender=org.apache.log4j.ConsoleAppender&lt;BR /&gt;log4j.appender.myConsoleAppender.layout=org.apache.log4j.PatternLayout&lt;BR /&gt;log4j.appender.myConsoleAppender.Target=System.out&lt;BR /&gt;log4j.appender.myConsoleAppender.layout.ConversionPattern=%d [%t] %-5p %c - %m%n&lt;BR /&gt;&lt;BR /&gt;log4j.appender.RollingAppender=org.apache.log4j.DailyRollingFileAppender&lt;BR /&gt;log4j.appender.RollingAppender.File=/opt/centralLogs/log/spark.log&lt;BR /&gt;#log4j.appender.RollingAppender.File=/opt/centralLogs/log/spark.log&lt;BR /&gt;log4j.appender.RollingAppender.DatePattern='.'yyyy-MM-dd&lt;BR /&gt;log4j.appender.RollingAppender.layout=org.apache.log4j.PatternLayout&lt;BR /&gt;log4j.appender.RollingAppender.layout.ConversionPattern=[%p] %d %c %M - %m%n&lt;/P&gt;
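&lt;P&gt;For readability, the same configuration laid out as a plain properties file (content identical to the above; the duplicated, commented-out File line is kept as in the original):&lt;/P&gt;

```properties
log4j.rootCategory=DEBUG, RollingAppender, myConsoleAppender
log4j.logger.example.spark=debug

log4j.appender.myConsoleAppender=org.apache.log4j.ConsoleAppender
log4j.appender.myConsoleAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.myConsoleAppender.Target=System.out
log4j.appender.myConsoleAppender.layout.ConversionPattern=%d [%t] %-5p %c - %m%n

log4j.appender.RollingAppender=org.apache.log4j.DailyRollingFileAppender
log4j.appender.RollingAppender.File=/opt/centralLogs/log/spark.log
#log4j.appender.RollingAppender.File=/opt/centralLogs/log/spark.log
log4j.appender.RollingAppender.DatePattern='.'yyyy-MM-dd
log4j.appender.RollingAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.RollingAppender.layout.ConversionPattern=[%p] %d %c %M - %m%n
```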
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;When I execute my code, I pass the --files flag with the location of the log4j.properties:&lt;/P&gt;
&lt;P&gt;spark-submit --name "CentralLog" --master yarn-client &amp;nbsp;--class example.spark.CentralLog &lt;STRONG&gt;--files /opt/centralLogs/conf/log4j.properties#log4j.properties&lt;/STRONG&gt; --jars $SPARK_CLASSPATH --executor-memory 2g &amp;nbsp;/opt/centralLogs/libProject/produban-paas.jar&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I have developed a small Scala application for Spark that uses log4j. I am tracing some messages with log.error and log.debug; I can see the log.error output but not the log.debug output.&lt;/P&gt;
&lt;P&gt;I assume that with --files the same log4j.properties is applied to both the driver and the executors.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Does anyone have a clue about what could be wrong?&lt;/P&gt;</description>
      <pubDate>Fri, 25 Oct 2019 16:26:18 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Config-log4j-in-Spark/m-p/34968#M11673</guid>
      <dc:creator>guillermoOrtiz</dc:creator>
      <dc:date>2019-10-25T16:26:18Z</dc:date>
    </item>
    <item>
      <title>Re: Config log4j in Spark</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Config-log4j-in-Spark/m-p/35124#M11674</link>
      <description>&lt;P&gt;&amp;gt; &lt;EM&gt;I guess that if I use --files I use the same log4j.properties for driver and executor.&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Where are you expecting your logs to be visible, by the way: at the &lt;STRONG&gt;driver&lt;/STRONG&gt;, or within the &lt;STRONG&gt;executors&lt;/STRONG&gt;?&amp;nbsp;Since you are using &lt;STRONG&gt;yarn-client&lt;/STRONG&gt; mode, the custom log4j configuration passed via &lt;STRONG&gt;--files&lt;/STRONG&gt; will be applied only to the executors.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;If you'd like it applied to the driver as well, using just &lt;STRONG&gt;--files&lt;/STRONG&gt;, you will need to use &lt;STRONG&gt;yarn-cluster&lt;/STRONG&gt; mode, like so:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT face="andale mono,times"&gt;spark-submit --name "CentralLog" &lt;STRONG&gt;--master yarn-cluster&lt;/STRONG&gt; &amp;nbsp;--class example.spark.CentralLog&amp;nbsp;&lt;STRONG&gt;--files /opt/centralLogs/conf/log4j.properties#log4j.properties&lt;/STRONG&gt;&amp;nbsp;--jars $SPARK_CLASSPATH --executor-memory 2g &amp;nbsp;/opt/centralLogs/libProject/produban-paas.jar&amp;nbsp;&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Otherwise, additionally&amp;nbsp;pass an explicit &lt;FONT face="andale mono,times"&gt;&lt;STRONG&gt;-Dlog4j.configuration=file:/opt/centralLogs/conf/log4j.properties&lt;/STRONG&gt;&lt;/FONT&gt; through &lt;FONT face="andale mono,times"&gt;&lt;STRONG&gt;spark.driver.extraJavaOptions&lt;/STRONG&gt;&lt;/FONT&gt; to make it work, like so:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT face="andale mono,times"&gt;spark-submit --name "CentralLog" --master yarn-client &amp;nbsp;--class example.spark.CentralLog&amp;nbsp;&lt;STRONG&gt;--files /opt/centralLogs/conf/log4j.properties#log4j.properties&lt;/STRONG&gt;&amp;nbsp;--conf &lt;STRONG&gt;spark.driver.extraJavaOptions='-Dlog4j.configuration=file:/opt/centralLogs/conf/log4j.properties'&lt;/STRONG&gt;&amp;nbsp;--jars $SPARK_CLASSPATH --executor-memory 2g &amp;nbsp;/opt/centralLogs/libProject/produban-paas.jar&amp;nbsp;&lt;/FONT&gt;&lt;/P&gt;</description>
      <pubDate>Sat, 12 Dec 2015 18:29:55 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Config-log4j-in-Spark/m-p/35124#M11674</guid>
      <dc:creator>Harsh J</dc:creator>
      <dc:date>2015-12-12T18:29:55Z</dc:date>
    </item>
    <item>
      <title>Re: Config log4j in Spark</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Config-log4j-in-Spark/m-p/43997#M11675</link>
      <description>&lt;P&gt;I have 5 Spark applications and I want 5 separate application log files. How can this be achieved?&lt;/P&gt;</description>
      <pubDate>Tue, 16 Aug 2016 04:45:22 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Config-log4j-in-Spark/m-p/43997#M11675</guid>
      <dc:creator>Lijju</dc:creator>
      <dc:date>2016-08-16T04:45:22Z</dc:date>
    </item>
    <item>
      <title>Re: Config log4j in Spark</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Config-log4j-in-Spark/m-p/49575#M11676</link>
      <description>&lt;P&gt;Spark properties control most application parameters and can be set using a SparkConf object or through Java system properties.&lt;BR /&gt;Environment variables can be used to set per-machine settings, such as the IP address, through the conf/spark-env.sh script on each node.&lt;BR /&gt;Logging can be configured through log4j.properties.&lt;/P&gt;</description>
      <pubDate>Wed, 18 Jan 2017 01:41:03 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Config-log4j-in-Spark/m-p/49575#M11676</guid>
      <dc:creator>ZachRoes</dc:creator>
      <dc:date>2017-01-18T01:41:03Z</dc:date>
    </item>
    <item>
      <title>Re: Config log4j in Spark</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Config-log4j-in-Spark/m-p/91285#M11677</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am trying to use a custom log4j configuration to gather the Spark driver logs (submitting jobs in cluster mode), but I am unable to achieve it.&lt;/P&gt;&lt;P&gt;Here is my custom log4j.properties file content:&lt;/P&gt;&lt;P&gt;log4j.rootCategory=ALL,FILE&lt;BR /&gt;log4j.appender.FILE=org.apache.log4j.RollingFileAppender&lt;/P&gt;&lt;P&gt;# Below is the Unix server path from which the job is submitted&lt;BR /&gt;log4j.appender.FILE.File=/some/path/to/edgeNode/SparkDriver.log&lt;/P&gt;&lt;P&gt;log4j.appender.FILE.Append=false&lt;BR /&gt;log4j.appender.FILE.layout=org.apache.log4j.PatternLayout&lt;BR /&gt;log4j.appender.FILE.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;And the command to submit the job:&lt;/P&gt;&lt;P&gt;spark2-submit --files /apps/test/config/driver_log4j.properties --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=driver_log4j.properties" --master yarn --deploy-mode cluster --num-executors 2 --executor-cores 4 --driver-memory 1g --executor-memory 16g --keytab XXXXX.keytab --principal XXXXX --class com.test.spark.par_1_submit par_submit.jar&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The error I'm getting:&lt;/P&gt;&lt;PRE&gt;java.io.FileNotFoundException: /some/path/to/edgeNode/SparkDriver.log (No such file or directory)&lt;/PRE&gt;</description>
      <pubDate>Thu, 06 Jun 2019 11:01:26 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Config-log4j-in-Spark/m-p/91285#M11677</guid>
      <dc:creator>akv31</dc:creator>
      <dc:date>2019-06-06T11:01:26Z</dc:date>
    </item>
    <item>
      <title>Re: Config log4j in Spark</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Config-log4j-in-Spark/m-p/281326#M11678</link>
      <description>&lt;P&gt;The application is probably expecting the log folder to exist so that it can write logs into it.&lt;/P&gt;&lt;P&gt;It seems your problem can be solved by creating the folder on the driver node:&lt;/P&gt;&lt;P&gt;/some/path/to/edgeNode/&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Also note that you have specified the log4j file only for the driver program. For the executors to generate logs as well, you may need to pass the following option to spark-submit:&lt;BR /&gt;&lt;SPAN&gt;"spark.executor.extraJavaOptions=-Dlog4j.configuration=driver_log4j.properties"&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 25 Oct 2019 05:43:40 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Config-log4j-in-Spark/m-p/281326#M11678</guid>
      <dc:creator>sparkr</dc:creator>
      <dc:date>2019-10-25T05:43:40Z</dc:date>
    </item>
  </channel>
</rss>

