
Configuring log4j in a Spark YARN cluster

New Contributor

I already read the other threads, but the log file just doesn't come out. I use the same commands, but they don't work. I'm using CDH 5.16.

My log4j.properties looks like this:



log4j.rootCategory=INFO, console, FILE

log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Set the default spark-shell log level to WARN. When running the spark-shell, the
# log level for this class is used to overwrite the root logger's log level, so that
# the user can have different defaults for the shell and regular Spark apps.
log4j.logger.org.apache.spark.repl.Main=WARN

# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark_project.jetty=WARN
log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=WARN
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=WARN

# SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR

log4j.appender.FILE=org.apache.log4j.RollingFileAppender
log4j.appender.FILE.File=/home/rzpt/logs/spark.log
log4j.appender.FILE.Encoding=UTF-8
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=[%-5p] [%d{yyyy-MM-dd HH:mm:ss}] [%C{1}:%M:%L] %m%n

I want the log output to go to /home/rzpt/logs/spark.log.

My log4j.properties is in this directory: src/main/resource/, and I put the same file on the server.

I tried these commands; none of them work:


spark2-submit \
--class com.nari.sgp.amc.measStandAssess.aurSum.AurSumMain \
--master yarn \
--deploy-mode cluster \
--files "./log4j.properties" \
sgp-1.0.jar

spark2-submit --class com.nari.sgp.amc.measStandAssess.aurSum.AurSumMain --master local[*] --deploy-mode client sgp-1.0.jar


spark2-submit --class com.nari.sgp.amc.measStandAssess.aurSum.AurSumMain --files "./log4j.properties" --driver-java-options "-Dlog4j.debug=true" --conf "spark.executor.extraJavaOptions=-Dlog4j.debug=true" --master yarn --deploy-mode cluster sgp-1.0.jar


spark2-submit --class com.nari.sgp.amc.measStandAssess.aurSum.AurSumMain --files "./log4j.properties" --driver-java-options "-Dlog4j.debug=true" --conf spark.driver.extraJavaOptions='-Dlog4j.configuration=file:/opt/centralLogs/conf/log4j.properties' --master yarn --deploy-mode cluster sgp-1.0.jar
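Before trying the submit commands, a quick local sanity check can confirm the properties file actually contains the FILE appender settings you expect (a sketch, not part of the original post; /tmp/log4j-check.properties is an illustrative scratch path standing in for your real log4j.properties):

```shell
# Write the FILE appender fragment from the config above to a scratch file.
cat > /tmp/log4j-check.properties <<'EOF'
log4j.appender.FILE=org.apache.log4j.RollingFileAppender
log4j.appender.FILE.File=/home/rzpt/logs/spark.log
log4j.appender.FILE.Encoding=UTF-8
EOF

# Pull the configured target path back out; this is where the driver and
# every executor will try to write on their local filesystem.
logpath=$(grep '^log4j.appender.FILE.File=' /tmp/log4j-check.properties | cut -d= -f2)
echo "FILE appender writes to: $logpath"
```

If the grep prints nothing, the appender was never wired up and Spark will silently fall back to its default logging.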



Expert Contributor

Hello @renzhongpei ,


From the log4j properties file, I see you are trying to write the logs to a local file path [ log4j.appender.FILE.File=/home/rzpt/logs/spark.log ].


Please note that, with the above log4j properties, the executors and the driver (in cluster mode) will try to write the log file to that path on every node where a container (executor) runs.


If that is your requirement, you would need to use a command like the following (assuming the log4j.properties file is in the local /tmp path on the node where you run spark2-submit):


spark2-submit --class com.nari.sgp.amc.measStandAssess.aurSum.AurSumMain --files /tmp/log4j.properties --conf spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties --conf spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties --master yarn --deploy-mode cluster sgp-1.0.jar


Note that in the above command you can pass the bare file name log4j.properties as the -Dlog4j.configuration value as it is, i.e. you don't need to give an explicit local path such as file://, since the executor automatically picks up log4j.properties from the container-localised path.
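To see why the bare file name works, here is a small local sketch of what YARN's file localisation does (the /tmp/submit_node and /tmp/container_cwd directories are illustrative assumptions, not Spark's actual staging paths): the file passed with --files is copied into each container's working directory, so a plain relative name resolves there.

```shell
# Illustrative only: /tmp/submit_node and /tmp/container_cwd stand in for
# the submitting host and a YARN container working directory.
mkdir -p /tmp/submit_node /tmp/container_cwd
echo 'log4j.rootCategory=INFO, console, FILE' > /tmp/submit_node/log4j.properties

# What --files /tmp/submit_node/log4j.properties effectively does:
# YARN copies the file into the container's working directory.
cp /tmp/submit_node/log4j.properties /tmp/container_cwd/

# The container JVM starts with its working directory set to the container
# directory, so -Dlog4j.configuration=log4j.properties resolves right here.
cd /tmp/container_cwd
[ -f log4j.properties ] && echo "log4j.properties localised"
```

This is also why prefixing the value with file:/some/local/path breaks in cluster mode: that path exists on the submitting host, not inside the containers.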