Support Questions

Find answers, ask questions, and share your expertise

change the Advanced spark2-log4j-properties for debug mode


we have the following Advanced spark2-log4j-properties, from Ambari: Spark2 --> Configs -->

# Set everything to be logged to the console
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
# Settings to quiet third party logs that are too verbose
log4j.logger.org.eclipse.jetty=WARN
log4j.logger.org.eclipse.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
log4j.logger.org.apache.spark.metrics.MetricsConfig=DEBUG
log4j.logger.org.apache.spark.deploy.yarn.Client=DEBUG

how do we change the current log4j config to debug mode?

Michael-Bronson
8 REPLIES


For example, I changed every level to ALL (the most verbose) to get the most detail in the Spark logs and then restarted Spark, but I do not see the logs giving any more data.


# Set everything to be logged to the console
log4j.rootCategory=ALL, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
# Settings to quiet third party logs that are too verbose
log4j.logger.org.eclipse.jetty=ALL
log4j.logger.org.eclipse.jetty.util.component.AbstractLifeCycle=ALL
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=ALL
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=ALL
log4j.logger.org.apache.spark.metrics.MetricsConfig=ALL
log4j.logger.org.apache.spark.deploy.yarn.Client=ALL
Michael-Bronson


Logging Levels

The valid logging levels are log4j’s Levels (from most specific to least):

  • OFF (most specific, no logging)
  • FATAL (most specific, little data)
  • ERROR
  • WARN
  • INFO
  • DEBUG
  • TRACE (least specific, a lot of data)
  • ALL (least specific, all data)
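As a minimal illustration (assuming the stock console appender from the question), a logger set to INFO emits INFO, WARN, ERROR, and FATAL events but suppresses DEBUG and TRACE; lowering it to DEBUG additionally lets DEBUG events through:

# INFO and more severe events pass; DEBUG and TRACE are suppressed
log4j.logger.org.apache.spark.deploy.yarn.Client=INFO
# DEBUG and more severe events pass; only TRACE is suppressed
log4j.logger.org.apache.spark.metrics.MetricsConfig=DEBUG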
Michael-Bronson


No, this doesn't help. On the master machines under /var/log/spark2 and on the datanode machines under /var/log/spark2 we don't see any change in the logs.

Michael-Bronson

@Michael Bronson

Use the below:

# Set everything to be logged to the console
log4j.rootCategory=DEBUG, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
# Settings to quiet third party logs that are too verbose
log4j.logger.org.eclipse.jetty=WARN
log4j.logger.org.eclipse.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
log4j.logger.org.apache.spark.metrics.MetricsConfig=DEBUG
log4j.logger.org.apache.spark.deploy.yarn.Client=DEBUG


I still don't see any DEBUG output in the logs under /var/log/spark2.

Michael-Bronson


@Michael Bronson, in /var/log/spark2 you will only see the Spark Thrift Server and Spark History Server logs. The log4j config I proposed above is for Spark applications, whose logs are not stored in /var/log/spark2; instead you should use the yarn logs command to extract them.
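For example (a sketch; the application ID below is a placeholder, so substitute the one shown by the first command):

# Find the application ID of your Spark job
yarn application -list -appStates ALL

# Fetch the aggregated logs for that application
yarn logs -applicationId application_1234567890123_0001 > app.log

Note that aggregated logs are only available this way once YARN log aggregation is enabled and, for running applications, may be incomplete until the application finishes.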

What is it you want to enable DEBUG logging for: a Spark application, the Spark Thrift Server, or the Spark History Server?


We need the debug output for the Spark Thrift Server. We have an issue where heartbeats from a datanode machine are not reaching the driver, and that is why we need debug mode on for the Spark Thrift Server.
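One sketch, assuming an Ambari-managed cluster where the Thrift Server picks up the same Advanced spark2-log4j-properties: set the root category to DEBUG there, then restart only the Spark2 Thrift Server component from Ambari. The DEBUG output should then appear in the Thrift Server's own log file under /var/log/spark2 (the exact file name varies by host and the user running the service):

# in Advanced spark2-log4j-properties
log4j.rootCategory=DEBUG, console

Since ALL/DEBUG at the root is extremely verbose, consider reverting to WARN once you have captured the heartbeat issue.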

Michael-Bronson