Created on 07-06-2018 07:50 AM
Place a custom log4j.properties file on an HDFS path, then reference that HDFS path in the workflow to override the default container logging configuration.
Sample log4j file:
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License. See accompanying LICENSE file.
#

# Define some default values that can be overridden by system properties
hadoop.root.logger=DEBUG,CLA

# Define the root logger to the system property "hadoop.root.logger".
log4j.rootLogger=${hadoop.root.logger}, EventCounter

# Logging Threshold
log4j.threshold=ALL

#
# ContainerLog Appender
#

# Default values
yarn.app.container.log.dir=null
yarn.app.container.log.filesize=100

log4j.appender.CLA=org.apache.hadoop.yarn.ContainerLogAppender
log4j.appender.CLA.containerLogDir=${yarn.app.container.log.dir}
log4j.appender.CLA.totalLogFileSize=${yarn.app.container.log.filesize}
log4j.appender.CLA.layout=org.apache.log4j.PatternLayout
log4j.appender.CLA.layout.ConversionPattern=%d{ISO8601} %p [%t] %c: %m%n

log4j.appender.CRLA=org.apache.hadoop.yarn.ContainerRollingLogAppender
log4j.appender.CRLA.containerLogDir=${yarn.app.container.log.dir}
log4j.appender.CRLA.maximumFileSize=${yarn.app.container.log.filesize}
log4j.appender.CRLA.maxBackupIndex=${yarn.app.container.log.backups}
log4j.appender.CRLA.layout=org.apache.log4j.PatternLayout
log4j.appender.CRLA.layout.ConversionPattern=%d{ISO8601} %p [%t] %c: %m%n

#
# Event Counter Appender
# Sends counts of logging messages at different severity levels to Hadoop Metrics.
#
log4j.appender.EventCounter=org.apache.hadoop.log.metrics.EventCounter
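Assuming the properties above are saved locally as log4j.properties, the file then needs to be copied to the HDFS path that the workflow will reference. The target path /tmp/log4j.properties below is only an example, chosen to match the sample workflow that follows:

# Upload the custom log4j configuration to HDFS (overwrite if it already exists)
hdfs dfs -put -f log4j.properties /tmp/log4j.properties
# Confirm the file is in place
hdfs dfs -ls /tmp/log4j.properties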
Sample Workflow.xml:
<workflow-app name="javaaction" xmlns="uri:oozie:workflow:0.5">
    <global>
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
    </global>
    <start to="java-action"/>
    <kill name="kill">
        <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <action name="java-action">
        <java>
            <configuration>
                <property>
                    <name>oozie.launcher.mapreduce.task.classpath.user.precedence</name>
                    <value>true</value>
                </property>
                <property>
                    <name>oozie.launcher.mapreduce.user.classpath.first</name>
                    <value>true</value>
                </property>
                <property>
                    <name>oozie.launcher.mapred.job.name</name>
                    <value>test</value>
                </property>
                <property>
                    <name>oozie.launcher.mapreduce.job.log4j-properties-file</name>
                    <value>${nameNode}/tmp/log4j.properties</value>
                </property>
            </configuration>
            <main-class>WordCount2</main-class>
            <arg>${nameNode}/tmp/input</arg>
            <arg>${nameNode}/tmp/output2</arg>
        </java>
        <ok to="end"/>
        <error to="kill"/>
    </action>
    <end name="end"/>
</workflow-app>
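To submit the workflow, a client-side job.properties and the standard Oozie CLI can be used. This is a minimal sketch; the host names, port numbers, and application path are placeholders and not values from the original example:

# job.properties (client side)
nameNode=hdfs://<namenode-host>:8020
jobTracker=<resourcemanager-host>:8032
oozie.wf.application.path=${nameNode}/user/${user.name}/javaaction

# Submit and start the workflow
oozie job -oozie http://<oozie-host>:11000/oozie -config job.properties -run

Once the java action has run, the container logs should reflect the overridden configuration (for example, DEBUG-level output from hadoop.root.logger above). They can be inspected with:

yarn logs -applicationId <application-id>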