Created 02-23-2017 11:47 AM
Created 02-23-2017 11:52 AM
Log in to Ambari, then navigate to:
YARN --> Configs --> Advanced --> Advanced yarn-log4j
Make your desired changes there. Is that what you are looking for?
Also, please take a look at "Advanced yarn-env" (the yarn-env template), where you will see how Ambari uses the "YARN_ROOT_LOGGER" property to define:
YARN_OPTS="$YARN_OPTS -Dhadoop.root.logger=${YARN_ROOT_LOGGER...........
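For context, the relevant lines of the yarn-env template typically look like the sketch below (the exact wording varies by HDP version, so treat this as illustrative, not a copy of your template): YARN_ROOT_LOGGER defaults to INFO with a console appender unless you override it, and it is then passed to the JVM as hadoop.root.logger.

```shell
# Illustrative fragment of a yarn-env template (exact content varies by version):
# default the root logger level/appender if not already set
export YARN_ROOT_LOGGER=${YARN_ROOT_LOGGER:-INFO,console}
# pass it to the YARN daemon JVM as the log4j root logger
YARN_OPTS="$YARN_OPTS -Dhadoop.root.logger=${YARN_ROOT_LOGGER}"
```

Setting YARN_ROOT_LOGGER (e.g. to DEBUG,console) before the daemons start therefore changes the root log level without editing the log4j file directly.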
Created 02-23-2017 12:14 PM
Since you have tagged Spark, I am assuming you are running a Spark app in yarn-cluster mode.
Create a log4j.properties file, set the desired log level there, and use the command below:
spark-submit --files file:///home/spark/log4j.properties --driver-java-options "-Dlog4j.configuration=./log4j.properties" --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=./log4j.properties" --class org.apache.spark.examples.SparkPi --master yarn-cluster --num-executors 3 --driver-memory 512m --executor-memory 512m --executor-cores 1 /usr/hdp/current/spark-client/lib/spark-examples-1.6.2.2.5.3.0-37-hadoop2.7.3.2.5.3.0-37.jar 10
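The log4j.properties file passed via --files could look like the following minimal sketch, which is modeled on Spark's default log4j template (adjust the level and pattern to taste; the file name just has to match what the -Dlog4j.configuration options reference):

```
# Set the root logger to ERROR to suppress INFO/WARN noise
log4j.rootCategory=ERROR, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

Because --files ships the file into each container's working directory, the relative path ./log4j.properties resolves correctly on both the driver and the executors.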
If it is not Spark, please use the solution provided by @Jay SenSharma.
Created 02-23-2017 04:33 PM
You could also do it in the Spark code:
import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}

def main(args: Array[String]): Unit = {
  // set the root log level before any Spark logging happens
  Logger.getRootLogger.setLevel(Level.ERROR)
  val conf = new SparkConf().setAppName("KafkaToHdfs")
  val sc = new SparkContext(conf)
  // ... rest of the application