New Contributor
Posts: 5
Registered: ‎06-17-2015
Accepted Solution

Config log4j in Spark

I have read the other threads about this topic, but I can't get it to work.


I'm using Cloudera 5.4.8 with Spark 1.3.0, and I have created a custom log4j configuration containing:

log4j.rootCategory=DEBUG, RollingAppender, myConsoleAppender
log4j.appender.myConsoleAppender.layout.ConversionPattern=%d [%t] %-5p %c - %m%n
log4j.appender.RollingAppender.layout.ConversionPattern=[%p] %d %c %M - %m%n
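(For the excerpt above to be a complete log4j.properties, each named appender also needs a class definition and, for the file appender, a target. A minimal sketch of the missing lines, assuming a ConsoleAppender and a RollingFileAppender; the file path and size limits below are illustrative, not from this thread:)

```
# Console appender (assumed class)
log4j.appender.myConsoleAppender=org.apache.log4j.ConsoleAppender
log4j.appender.myConsoleAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.myConsoleAppender.layout.ConversionPattern=%d [%t] %-5p %c - %m%n

# Rolling file appender (assumed class; file path and limits are illustrative)
log4j.appender.RollingAppender=org.apache.log4j.RollingFileAppender
log4j.appender.RollingAppender.File=/opt/centralLogs/logs/central.log
log4j.appender.RollingAppender.MaxFileSize=10MB
log4j.appender.RollingAppender.MaxBackupIndex=5
log4j.appender.RollingAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.RollingAppender.layout.ConversionPattern=[%p] %d %c %M - %m%n
```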


When I execute my code, I use the --files flag with the location of the configuration:
spark-submit --name "CentralLog" --master yarn-client  --class example.spark.CentralLog --files /opt/centralLogs/conf/ --jars $SPARK_CLASSPATH --executor-memory 2g  /opt/centralLogs/libProject/produban-paas.jar 


I have developed a small Scala application for Spark in which I use log4j. I am tracing some log.error and log.debug calls; I can see the log.error output but not the log.debug output.

I guess that if I use --files, the same configuration is applied to both the driver and the executors.


Does someone have a clue about what could be wrong?

Posts: 1,903
Kudos: 436
Solutions: 307
Registered: ‎07-31-2013

Re: Config log4j in Spark

> I guess that if I use --files I use the same for driver and executor.


Where are you expecting your logs to be visible, by the way? At the driver, or within the executors? Since you are using yarn-client mode, the custom logger configuration passed via --files will be applied only to the executors.


If you'd like it applied to the driver as well via just the use of --files, you will need to use yarn-cluster mode, like so:


spark-submit --name "CentralLog" --master yarn-cluster  --class example.spark.CentralLog --files /opt/centralLogs/conf/ --jars $SPARK_CLASSPATH --executor-memory 2g  /opt/centralLogs/libProject/produban-paas.jar 


Otherwise, additionally pass an explicit -Dlog4j.configuration=file:/opt/centralLogs/conf/ through spark.driver.extraJavaOptions to make it work, like so:


spark-submit --name "CentralLog" --master yarn-client  --class example.spark.CentralLog --files /opt/centralLogs/conf/ --conf spark.driver.extraJavaOptions='-Dlog4j.configuration=file:/opt/centralLogs/conf/' --jars $SPARK_CLASSPATH --executor-memory 2g  /opt/centralLogs/libProject/produban-paas.jar 

New Contributor
Posts: 1
Registered: ‎08-15-2016

Re: Config log4j in Spark


I have 5 Spark applications, and I want each one to write its own separate log. How can this be achieved?
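(One common approach, offered here only as a sketch and not taken from this thread, is to give each application its own log4j.properties and point each spark-submit at it via --files plus the extraJavaOptions settings discussed above. The file and class names below are hypothetical:)

```
# Hypothetical per-application setup: one properties file per app
spark-submit --name "App1" --master yarn --deploy-mode cluster \
  --files /opt/conf/app1-log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=app1-log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=app1-log4j.properties" \
  --class example.App1 app1.jar

# Repeat with app2-log4j.properties (each pointing its file appender
# at a different log file) for the second application, and so on.
```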

Cloudera Employee
Posts: 20
Registered: ‎01-17-2017

Re: Config log4j in Spark

Spark properties control most application parameters and can be set by using a SparkConf object, or through Java system properties.
Environment variables can be used to set per-machine settings, such as the IP address, through the conf/spark-env.sh script on each node.
Logging can be configured through log4j.properties.
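(As a concrete sketch of the two mechanisms mentioned above, in Scala; the application name, property values, and logger name are just examples:)

```scala
import org.apache.spark.SparkConf
import org.apache.log4j.{Level, Logger}

// Spark properties set programmatically via a SparkConf object
val conf = new SparkConf()
  .setAppName("CentralLog")            // example app name
  .set("spark.executor.memory", "2g")  // example property

// Logging adjusted at runtime through the log4j 1.x API
Logger.getLogger("org.apache.spark").setLevel(Level.WARN)
```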

Posts: 6
Registered: ‎11-02-2018

Re: Config log4j in Spark



I am trying to use a custom log4j configuration to gather the Spark driver logs (submitting jobs in cluster mode), but I am unable to make it work.

Here is my custom file content:


# Below is the Unix server path from which the job is submitted

log4j.appender.FILE.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n


And the command to submit the job:

spark2-submit --files /apps/test/config/ --conf "" --master yarn --deploy-mode cluster --num-executors 2 --executor-cores 4 --driver-memory 1g --executor-memory 16g --keytab XXXXX.keytab --principal XXXXX --class com.test.spark.par_1_submit par_submit.jar


The error I'm getting: /some/path/to/edgeNode/SparkDriver.log (No such file or directory)
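(In cluster mode the driver runs inside a YARN container on an arbitrary cluster node, so an absolute path that exists only on the edge node will not be found there. One possible workaround, sketched here under the assumption that your appender is named FILE as in the excerpt above, is to write into the YARN container's own log directory, which Spark substitutes at runtime on each node:)

```
# Sketch: point the FILE appender at the YARN container log directory
log4j.appender.FILE=org.apache.log4j.FileAppender
log4j.appender.FILE.File=${spark.yarn.app.container.log.dir}/SparkDriver.log
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

The resulting file then shows up under the container logs (e.g. via yarn logs -applicationId) rather than on the edge node.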