Change the Advanced spark2-log4j-properties for debug mode
Labels: Apache Spark
Created 09-05-2018 01:10 PM
We have the following Advanced spark2-log4j-properties, from Ambari --> Spark2 --> Configs:
# Set everything to be logged to the console
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
# Settings to quiet third party logs that are too verbose
log4j.logger.org.eclipse.jetty=WARN
log4j.logger.org.eclipse.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
log4j.logger.org.apache.spark.metrics.MetricsConfig=DEBUG
log4j.logger.org.apache.spark.deploy.yarn.Client=DEBUG
How do we change the current log4j configuration to debug mode?
Created 09-05-2018 02:18 PM
For example, I changed everything to ALL (the least specific level) to get the most detail in the Spark logs, and then restarted Spark, but I don't see the logs giving any more data.
# Set everything to be logged to the console
log4j.rootCategory=ALL, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
# Settings to quiet third party logs that are too verbose
log4j.logger.org.eclipse.jetty=ALL
log4j.logger.org.eclipse.jetty.util.component.AbstractLifeCycle=ALL
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=ALL
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=ALL
log4j.logger.org.apache.spark.metrics.MetricsConfig=ALL
log4j.logger.org.apache.spark.deploy.yarn.Client=ALL
Created 09-05-2018 02:19 PM
Logging Levels
The valid logging levels are log4j’s Levels (from most specific to least):
- OFF (most specific, no logging)
- FATAL (most specific, little data)
- ERROR
- WARN
- INFO
- DEBUG
- TRACE (least specific, a lot of data)
- ALL (least specific, all data)
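As an illustrative sketch (not from the original posts) of how these levels interact in a log4j 1.x properties file like the ones above: the level set on a more specific logger overrides the root category for that logger's subtree, so a package pinned to WARN stays at WARN even when the root is raised to DEBUG.
# Sketch only: root raised to DEBUG, but org.eclipse.jetty stays at WARN
# because the more specific logger setting takes precedence.
log4j.rootCategory=DEBUG, console
log4j.logger.org.eclipse.jetty=WARN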
Created 09-05-2018 03:30 PM
Hi @Michael Bronson,
Can you please refer to @Sandeep Nemuri's blog at https://community.hortonworks.com/content/supportkb/150095/how-to-enable-debug-logging-for-spark-his... and see if that helps you?
Created 09-05-2018 03:58 PM
No, this doesn't help; under /var/log/spark2 on the master machines and on the datanode machines we do not see any change in the logs.
Created 09-05-2018 04:23 PM
Use the following:
# Set everything to be logged to the console
log4j.rootCategory=DEBUG, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
# Settings to quiet third party logs that are too verbose
log4j.logger.org.eclipse.jetty=WARN
log4j.logger.org.eclipse.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
log4j.logger.org.apache.spark.metrics.MetricsConfig=DEBUG
log4j.logger.org.apache.spark.deploy.yarn.Client=DEBUG
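One possible way to confirm the change for a Spark application (a sketch with assumptions: client deploy mode, an HDP-style install, and the bundled SparkPi example jar under /usr/hdp/current/spark2-client, which may differ on your cluster): since the appender writes to the console (System.err), the extra DEBUG lines should show up in the driver's stderr.
# Jar path and name are assumptions for an HDP-style spark2 client install.
spark-submit --class org.apache.spark.examples.SparkPi \
  /usr/hdp/current/spark2-client/examples/jars/spark-examples_*.jar 10 \
  2>&1 | grep " DEBUG " | head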
Created 09-05-2018 06:04 PM
I still don't see any DEBUG entries in the logs under /var/log/spark2.
Created 09-06-2018 08:25 AM
@Michael Bronson, you will see the Spark Thrift Server and Spark History Server logs in /var/log/spark2. The log4j configuration I proposed above is for Spark applications, whose logs are not stored in /var/log/spark2; instead you should use the yarn logs command to extract them.
What do you want to enable DEBUG logging for: a Spark application, the Spark Thrift Server, or the Spark History Server?
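For the application-log case described above, pulling the aggregated YARN logs typically looks like the sketch below; the application ID is a placeholder, so substitute a real one from yarn application -list or the ResourceManager UI.
# Placeholder application ID; replace it with a real one from your cluster.
yarn logs -applicationId application_1536140000000_0001 > app_logs.txt
grep " DEBUG " app_logs.txt | head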
Created 09-06-2018 10:01 AM
We need DEBUG for the Spark Thrift Server. We have an issue where heartbeats from a datanode machine are not reaching the driver, so that is why we need DEBUG mode enabled for the Spark Thrift Server.
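For the Spark Thrift Server specifically, a rough verification sketch (assumptions: the Thrift Server picks up the Ambari-managed spark2-log4j-properties on restart, and the log file name pattern below matches what your host actually writes under /var/log/spark2): save the config change in Ambari, restart the Spark2 Thrift Server, then watch its log for DEBUG entries.
# The log file name pattern is an assumption; adjust it to the actual
# Thrift Server log file present under /var/log/spark2.
tail -F /var/log/spark2/*thriftserver* | grep " DEBUG "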