Config log4j in Spark Yarn cluster

New Contributor

I have already read another thread about this, but the log still doesn't come out:

https://community.cloudera.com/t5/Support-Questions/Config-log4j-in-Spark/m-p/34968

I used the same command, but it doesn't work.

I'm using CDH 5.16.

 

My log4j.properties:


log4j.rootCategory=INFO, console,FILE
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Set the default spark-shell log level to WARN. When running the spark-shell, the
# log level for this class is used to overwrite the root logger's log level, so that
# the user can have different defaults for the shell and regular Spark apps.
log4j.logger.org.apache.spark.repl.Main=WARN

# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark_project.jetty=WARN
log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=WARN
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=WARN
log4j.logger.org.apache.parquet=ERROR
log4j.logger.parquet=ERROR

# SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR

# File appender: write DEBUG-and-above messages to a local log file
log4j.appender.FILE=org.apache.log4j.RollingFileAppender
log4j.appender.FILE.Threshold=DEBUG
#log4j.appender.FILE.File=./log/rzp.log
log4j.appender.FILE.File=/home/rzpt/logs/spark.log
log4j.appender.FILE.Encoding=UTF-8
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=[%-5p] [%d{yyyy-MM-dd HH:mm:ss}] [%C{1}:%M:%L] %m%n
log4j.appender.FILE.MaxFileSize=10mb

I want the log output to go to /home/rzpt/logs/spark.log.

My log4j.properties is in this directory: src/main/resource/log4j.properties, and I put the same log4j.properties on the server.

I tried these commands; none of them work:

 


spark2-submit \
--class com.nari.sgp.amc.measStandAssess.aurSum.AurSumMain \
--master yarn \
--deploy-mode cluster \
--files "./log4j.properties" \
./sgp-1.0.jar

spark2-submit --class com.nari.sgp.amc.measStandAssess.aurSum.AurSumMain --master local[*] --deploy-mode client sgp-1.0.jar

 

spark2-submit --class com.nari.sgp.amc.measStandAssess.aurSum.AurSumMain --files "./log4j.properties" --driver-java-options "-Dlog4j.debug=true -Dlog4j.configuration=log4j.properties" --conf "spark.executor.extraJavaOptions=-Dlog4j.debug=true -Dlog4j.configuration=log4j.properties" --master yarn --deploy-mode cluster sgp-1.0.jar

 

spark2-submit --class com.nari.sgp.amc.measStandAssess.aurSum.AurSumMain --files "./log4j.properties" --driver-java-options "-Dlog4j.debug=true -Dlog4j.configuration=log4j.properties" --conf spark.driver.extraJavaOptions='-Dlog4j.configuration=file:/opt/centralLogs/conf/log4j.properties' --master yarn --deploy-mode cluster sgp-1.0.jar

 

1 REPLY

Expert Contributor

Hello @renzhongpei ,

 

From the log4j properties file, I see you are trying to write the logs to a local file path [ log4j.appender.FILE.File=/home/rzpt/logs/spark.log ].

 

Please note that with the above log4j properties, the executors and the driver (in cluster mode) will each try to write the log file to that path on every node where a container runs.
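As a side note, if you would rather have each container's log land in YARN's own container log directory (so that it is collected by YARN log aggregation), you can point the file appender at the spark.yarn.app.container.log.dir property described in Spark's running-on-YARN documentation. A minimal sketch, reusing the FILE appender name from your properties above:

# write into the YARN container log directory instead of a fixed local path
log4j.appender.FILE=org.apache.log4j.RollingFileAppender
log4j.appender.FILE.File=${spark.yarn.app.container.log.dir}/spark.log
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=[%-5p] [%d{yyyy-MM-dd HH:mm:ss}] [%C{1}:%M:%L] %m%n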

 

If writing to that local path on every node is indeed what you want, you would need to use a command like this (assuming the log4j.properties file is in the local /tmp path on the node where you run spark2-submit):

 

spark2-submit --class com.nari.sgp.amc.measStandAssess.aurSum.AurSumMain --files /tmp/log4j.properties --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties" --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" --master yarn --deploy-mode cluster sgp-1.0.jar

 

Note that in the above command you can use "-Dlog4j.configuration=log4j.properties" as it is, i.e. you don't need to give an explicit local path such as file://, since the driver and executors will automatically pick up log4j.properties from the container-localized path (the copy distributed via --files).
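If you go with the spark.yarn.app.container.log.dir variant mentioned above, and YARN log aggregation is enabled on your cluster, you can verify the output once the application finishes by pulling the aggregated container logs, for example:

yarn logs -applicationId <application_id>

With the local-path appender, you would instead check /home/rzpt/logs/spark.log on each NodeManager host where a container ran.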

Thanks,
Satz