
How to make log files rotate based on size using log4j.properties, and also zip the rotated files

Hi all,

We configured Hive, and log4j with a RollingFileAppender:

log4j.appender.DRFA=org.apache.log4j.RollingFileAppender
log4j.appender.DRFA.MaxBackupIndex=10
log4j.appender.DRFA.MaxFileSize=1KB

Full details:

# Define some default values that can be overridden by system properties
hive.log.threshold=ALL
hive.root.logger=INFO,DRFA
hive.log.dir=${java.io.tmpdir}/${user.name}
hive.log.file=hive.log
# Define the root logger to the system property "hadoop.root.logger".
log4j.rootLogger=${hive.root.logger}, EventCounter
# Logging Threshold
log4j.threshold=${hive.log.threshold}
#
# Daily Rolling File Appender
#
# Use the PidDailyRollingFileAppender class instead if you want to use separate log files
# for different CLI sessions.
#
# log4j.appender.DRFA=org.apache.hadoop.hive.ql.log.PidDailyRollingFileAppender
#log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DRFA.File=${hive.log.dir}/${hive.log.file}
# Rollover at midnight
#log4j.appender.DRFA.DatePattern=.yyyy-MM-dd
log4j.appender.DRFA=org.apache.log4j.RollingFileAppender
log4j.appender.DRFA.MaxBackupIndex=10
log4j.appender.DRFA.MaxFileSize=1KB
# Keep up to 10 rotated backups (see MaxBackupIndex above)
log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
# Pattern format: Date LogLevel LoggerName LogMessage
#log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
# Debugging Pattern format
log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %-5p [%t]: %c{2} (%F:%M(%L)) - %m%n

The log files on the machine are:

-rw-r--r--  1 hive hadoop    1113 Aug 15 13:10 hivemetastore.log.10
-rw-r--r--  1 hive hadoop    1028 Aug 15 13:10 hivemetastore.log.9
-rw-r--r--  1 hive hadoop    1070 Aug 15 13:11 hivemetastore.log.8
-rw-r--r--  1 hive hadoop    1239 Aug 15 13:12 hiveserver2.log.10
-rw-r--r--  1 hive hadoop    1154 Aug 15 13:13 hivemetastore.log.7
-rw-r--r--  1 hive hadoop    1133 Aug 15 13:13 hivemetastore.log.6
-rw-r--r--  1 hive hadoop    1055 Aug 15 13:15 hiveserver2.log.9
-rw-r--r--  1 hive hadoop    1203 Aug 15 13:15 hiveserver2.log.8
-rw-r--r--  1 hive hadoop    1098 Aug 15 13:15 hiveserver2.log.7
-rw-r--r--  1 hive hadoop    1028 Aug 15 13:15 hiveserver2.log.6
-rw-r--r--  1 hive hadoop    1239 Aug 15 13:15 hiveserver2.log.5
-rw-r--r--  1 hive hadoop    1113 Aug 15 13:16 hivemetastore.log.5
-rw-r--r--  1 hive hadoop    1028 Aug 15 13:16 hivemetastore.log.4
-rw-r--r--  1 hive hadoop    1070 Aug 15 13:16 hivemetastore.log.3
-rw-r--r--  1 hive hadoop    1048 Aug 15 13:18 hiveserver2.log.4
-rw-r--r--  1 hive hadoop    1173 Aug 15 13:18 hiveserver2.log.3
-rw-r--r--  1 hive hadoop    1157 Aug 15 13:18 hiveserver2.log.2
-rw-r--r--  1 hive hadoop    1239 Aug 15 13:18 hiveserver2.log.1
-rw-r--r--  1 hive hadoop     503 Aug 15 13:18 hiveserver2.log
-rw-r--r--  1 hive hadoop    1154 Aug 15 13:19 hivemetastore.log.2
-rw-r--r--  1 hive hadoop    1133 Aug 15 13:19 hivemetastore.log.1
-rw-r--r--  1 hive hadoop     292 Aug 15 13:19 hivemetastore.log
-rw-r--r--  1 hive hadoop    4904 Aug 15 13:20 hivemetastore-report.json.tmp
-rw-r--r--  1 hive hadoop    4273 Aug 15 13:20 hiveserver2-report.json.tmp

My question is: all the rotated logs, such as hiveserver2.log.1, hiveserver2.log.2, etc., are not zipped.

What change do I need to make in log4j in order to zip the rotated files?

Michael-Bronson
1 ACCEPTED SOLUTION

Please use the below config for size-based rotation.

log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
log4j.appender.DRFA=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.DRFA.rollingPolicy=org.apache.log4j.rolling.FixedWindowRollingPolicy
log4j.appender.DRFA.triggeringPolicy=org.apache.log4j.rolling.SizeBasedTriggeringPolicy
log4j.appender.DRFA.rollingPolicy.ActiveFileName=${hive.log.dir}/${hive.log.file}.log
log4j.appender.DRFA.rollingPolicy.FileNamePattern=${hive.log.dir}/${hive.log.file}-.%i.log.gz
log4j.appender.DRFA.triggeringPolicy.MaxFileSize=10000
log4j.appender.DRFA.rollingPolicy.maxIndex=10

I have tried this and it works perfectly fine.

For some reason MaxFileSize does not work when given with a unit such as 1MB, so specify the size in plain bytes as shown above.

Note: please check hive-server2.err for any warnings if it does not work.
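
For readers hitting the same problem, some context: the org.apache.log4j.rolling.* classes used above come from the apache-log4j-extras library, which must be on the Hive classpath, while the stock org.apache.log4j.RollingFileAppender has no compression support and does not understand the rollingPolicy.* keys - which is why configs based on the stock appender produce plain, uncompressed backups. Below is a minimal annotated sketch of the same size-based setup; the 10000000-byte threshold (roughly 10 MB) is an illustrative value, not taken from the post above, and ActiveFileName is written without the extra .log suffix so the live file keeps its usual hive.log name (the accepted config appends .log instead).

# Rolling appender from apache-log4j-extras (the stock RollingFileAppender cannot gzip)
log4j.appender.DRFA=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %-5p [%t]: %c{2} (%F:%M(%L)) - %m%n
# Keep a fixed window of ten compressed backups: hive.log-.1.log.gz ... hive.log-.10.log.gz
log4j.appender.DRFA.rollingPolicy=org.apache.log4j.rolling.FixedWindowRollingPolicy
log4j.appender.DRFA.rollingPolicy.ActiveFileName=${hive.log.dir}/${hive.log.file}
log4j.appender.DRFA.rollingPolicy.FileNamePattern=${hive.log.dir}/${hive.log.file}-.%i.log.gz
log4j.appender.DRFA.rollingPolicy.maxIndex=10
# Roll when the active file exceeds this many bytes (plain number, no KB/MB suffix)
log4j.appender.DRFA.triggeringPolicy=org.apache.log4j.rolling.SizeBasedTriggeringPolicy
log4j.appender.DRFA.triggeringPolicy.MaxFileSize=10000000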


12 REPLIES

Super Mentor

@Michael Bronson

Please define a "rollingPolicy" inside the RollingFileAppender, something like what is described in the following article:

https://community.hortonworks.com/articles/50058/using-log4j-extras-how-to-rotate-as-well-as-zip-th....

Create a new appender, e.g. ZIPRFA, as in the mentioned article:

log4j.appender.XXXXXX.rollingPolicy.FileNamePattern=${hive.log.dir}/${hive.log.file}-.%d{yyyyMMdd}.log.gz
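
For reference, a fuller sketch of what such an appender block might look like with a time-based policy. The ZIPRFA name and the daily %d pattern follow the linked article, the org.apache.log4j.rolling classes are assumed to come from the apache-log4j-extras jar on the classpath, and the pattern layout is copied from the existing DRFA appender:

# Standalone appender that rolls once a day and gzips the rolled file
# (compression is chosen from the .gz suffix of the FileNamePattern)
log4j.appender.ZIPRFA=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.ZIPRFA.layout=org.apache.log4j.PatternLayout
log4j.appender.ZIPRFA.layout.ConversionPattern=%d{ISO8601} %-5p [%t]: %c{2} (%F:%M(%L)) - %m%n
log4j.appender.ZIPRFA.rollingPolicy=org.apache.log4j.rolling.TimeBasedRollingPolicy
log4j.appender.ZIPRFA.rollingPolicy.ActiveFileName=${hive.log.dir}/${hive.log.file}
log4j.appender.ZIPRFA.rollingPolicy.FileNamePattern=${hive.log.dir}/${hive.log.file}-.%d{yyyyMMdd}.log.gz
# ...and point the logger at it, e.g. hive.root.logger=INFO,ZIPRFA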


Hi Jay,

We configured the following, but the rotated files are still not zipped.

What is wrong in my log4j?

hive.log.threshold=ALL
hive.root.logger=INFO,DRFA
hive.log.dir=${java.io.tmpdir}/${user.name}
hive.log.file=hive.log
# Define the root logger to the system property "hadoop.root.logger".
log4j.rootLogger=${hive.root.logger}, EventCounter
# Logging Threshold
log4j.threshold=${hive.log.threshold}
#
# Daily Rolling File Appender
#
# Use the PidDailyRollingFileAppender class instead if you want to use separate log files
# for different CLI sessions.
#
# log4j.appender.DRFA=org.apache.hadoop.hive.ql.log.PidDailyRollingFileAppender
#log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DRFA.File=${hive.log.dir}/${hive.log.file}
# Rollover at midnight
#log4j.appender.DRFA.DatePattern=.yyyy-MM-dd
log4j.appender.DRFA=org.apache.log4j.RollingFileAppender
log4j.appender.DRFA.MaxBackupIndex=10
log4j.appender.DRFA.MaxFileSize=1KB
log4j.appender.DRFA.rollingPolicy.FileNamePattern=${hive.log.dir}/${hive.log.file}-.%d{yyyyMMdd}.log.gz
Michael-Bronson

I also configured the following (without a date), but still without zip:

log4j.appender.DRFA=org.apache.log4j.RollingFileAppender
log4j.appender.DRFA.rollingPolicy.FileNamePattern=${hive.log.dir}/${hive.log.file}-.%i.log.zip
log4j.appender.DRFA.MaxBackupIndex=10
log4j.appender.DRFA.MaxFileSize=1KB
Michael-Bronson

@Jay, can you help me regarding my last notes?

Michael-Bronson

Please use the below config for size-based rotation.

log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
log4j.appender.DRFA=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.DRFA.rollingPolicy=org.apache.log4j.rolling.FixedWindowRollingPolicy
log4j.appender.DRFA.triggeringPolicy=org.apache.log4j.rolling.SizeBasedTriggeringPolicy
log4j.appender.DRFA.rollingPolicy.ActiveFileName=${hive.log.dir}/${hive.log.file}.log
log4j.appender.DRFA.rollingPolicy.FileNamePattern=${hive.log.dir}/${hive.log.file}-.%i.log.gz
log4j.appender.DRFA.triggeringPolicy.MaxFileSize=10000
log4j.appender.DRFA.rollingPolicy.maxIndex=10

I have tried this and it works perfectly fine.

For some reason MaxFileSize does not work when given with a unit such as 1MB, so specify the size in plain bytes as shown above.

Note: please check hive-server2.err for any warnings if it does not work.

Do you mean that I need to change my lines from

log4j.appender.DRFA=org.apache.log4j.RollingFileAppender
log4j.appender.DRFA.rollingPolicy.FileNamePattern=${hive.log.dir}/${hive.log.file}-.%i.log.zip
log4j.appender.DRFA.MaxBackupIndex=10 
log4j.appender.DRFA.MaxFileSize=1KB

to

log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
log4j.appender.DRFA=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.DRFA.rollingPolicy=org.apache.log4j.rolling.FixedWindowRollingPolicy
log4j.appender.DRFA.triggeringPolicy=org.apache.log4j.rolling.SizeBasedTriggeringPolicy
log4j.appender.DRFA.rollingPolicy.ActiveFileName=${hive.log.dir}/${hive.log.file}.log
log4j.appender.DRFA.rollingPolicy.FileNamePattern=${hive.log.dir}/${hive.log.file}-.%i.log.gz
log4j.appender.DRFA.triggeringPolicy.MaxFileSize=10000
log4j.appender.DRFA.rollingPolicy.maxIndex=10

?

Michael-Bronson

@Michael Bronson yes, that is correct - please see attached my log4j configuration in ambari.log4j.txt

Second:

my Ambari cluster is based on Linux machines,

so what is this line about (is it related to Windows?):

log4j.appender.DRFA.rollingPolicy=org.apache.log4j.rolling.FixedWindowRollingPolicy

Michael-Bronson

@Michael Bronson see my above comment - "FixedWindowRollingPolicy" (it is "window", not "windows") is the policy name; it has nothing to do with the OS.
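
To expand on that a little: FixedWindowRollingPolicy simply keeps a fixed "window" of numbered backup files. A small sketch of how its settings interact (the minIndex line is added here purely for illustration; it defaults to 1):

# On each rollover the backups shift up one slot (.1 -> .2, .2 -> .3, ...),
# the backup beyond maxIndex is deleted, and the active log is compressed
# into slot minIndex because the FileNamePattern ends in .gz
log4j.appender.DRFA.rollingPolicy=org.apache.log4j.rolling.FixedWindowRollingPolicy
log4j.appender.DRFA.rollingPolicy.minIndex=1
log4j.appender.DRFA.rollingPolicy.maxIndex=10
log4j.appender.DRFA.rollingPolicy.FileNamePattern=${hive.log.dir}/${hive.log.file}-.%i.log.gz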

OK, let me try to set it on my Ambari; I will update ...

Michael-Bronson

These are the new logs that I get now:


-rw-r--r--  1 hive hadoop     389 Aug 15 19:28 hivemetastore.log-.10.log.gz
-rw-r--r--. 1 hive hadoop       0 Aug 15 19:29 hive.err
-rw-r--r--. 1 hive hadoop      31 Aug 15 19:29 hive.out
-rw-r--r--. 1 hive hadoop       0 Aug 15 19:29 hive-server2.out
-rw-r--r--. 1 hive hadoop       0 Aug 15 19:29 hive-server2.err
-rw-r--r--  1 hive hadoop    1719 Aug 15 19:34 hivemetastore.log-.9.log.gz
-rw-r--r--  1 hive hadoop    4357 Aug 15 19:34 hiveserver2-report.json
-rw-r--r--  1 hive hadoop    4315 Aug 15 19:34 hivemetastore-report.json
-rw-r--r--  1 hive hadoop    1240 Aug 15 19:42 hiveserver2.log-.10.log.gz
-rw-r--r--  1 hive hadoop    1038 Aug 15 19:43 hivemetastore.log-.8.log.gz
-rw-r--r--  1 hive hadoop    1159 Aug 15 19:48 hiveserver2.log-.9.log.gz
-rw-r--r--  1 hive hadoop    1038 Aug 15 19:49 hivemetastore.log-.7.log.gz
-rw-r--r--  1 hive hadoop    1150 Aug 15 19:54 hiveserver2.log-.8.log.gz
-rw-r--r--  1 hive hadoop    1019 Aug 15 19:58 hivemetastore.log-.6.log.gz
-rw-r--r--  1 hive hadoop    1168 Aug 15 20:00 hiveserver2.log-.7.log.gz
-rw-r--r--  1 hive hadoop    1174 Aug 15 20:06 hiveserver2.log-.6.log.gz
-rw-r--r--  1 hive hadoop    1016 Aug 15 20:07 hivemetastore.log-.5.log.gz
-rw-r--r--  1 hive hadoop    1156 Aug 15 20:12 hiveserver2.log-.5.log.gz
-rw-r--r--  1 hive hadoop    1019 Aug 15 20:13 hivemetastore.log-.4.log.gz
-rw-r--r--  1 hive hadoop    1159 Aug 15 20:18 hiveserver2.log-.4.log.gz
-rw-r--r--  1 hive hadoop    1028 Aug 15 20:22 hivemetastore.log-.3.log.gz
-rw-r--r--  1 hive hadoop    1171 Aug 15 20:24 hiveserver2.log-.3.log.gz
-rw-r--r--  1 hive hadoop    1167 Aug 15 20:30 hiveserver2.log-.2.log.gz
-rw-r--r--  1 hive hadoop    1017 Aug 15 20:31 hivemetastore.log-.2.log.gz
-rw-r--r--  1 hive hadoop     233 Aug 15 20:36 hiveserver2.log
-rw-r--r--  1 hive hadoop    1166 Aug 15 20:36 hiveserver2.log-.1.log.gz
-rw-r--r--  1 hive hadoop    1003 Aug 15 20:37 hivemetastore.log-.1.log.gz
-rw-r--r--  1 hive hadoop    1218 Aug 15 20:37 hivemetastore.log
-rw-r--r--  1 hive hadoop    4248 Aug 15 20:39 hivemetastore-report.json.tmp
-rw-r--r--  1 hive hadoop    4363 Aug 15 20:39 hiveserver2-report.json.tmp

[root@master01 hive]# du -sh *
0       hive.err
4.0K    hivemetastore.log
4.0K    hivemetastore.log-.10.log.gz
4.0K    hivemetastore.log-.1.log.gz
4.0K    hivemetastore.log-.2.log.gz
4.0K    hivemetastore.log-.3.log.gz
4.0K    hivemetastore.log-.4.log.gz
4.0K    hivemetastore.log-.5.log.gz
4.0K    hivemetastore.log-.6.log.gz
4.0K    hivemetastore.log-.7.log.gz
4.0K    hivemetastore.log-.8.log.gz
4.0K    hivemetastore.log-.9.log.gz
8.0K    hivemetastore-report.json
8.0K    hivemetastore-report.json.tmp
4.0K    hive.out
0       hive-server2.err
4.0K    hiveserver2.log
4.0K    hiveserver2.log-.10.log.gz
4.0K    hiveserver2.log-.1.log.gz
4.0K    hiveserver2.log-.2.log.gz
4.0K    hiveserver2.log-.3.log.gz
4.0K    hiveserver2.log-.4.log.gz
4.0K    hiveserver2.log-.5.log.gz
4.0K    hiveserver2.log-.6.log.gz
4.0K    hiveserver2.log-.7.log.gz
4.0K    hiveserver2.log-.8.log.gz
4.0K    hiveserver2.log-.9.log.gz
0       hive-server2.out
8.0K    hiveserver2-report.json
8.0K    hiveserver2-report.json.tmp

Michael-Bronson

@Michael Bronson

That's good - please mark the correct answer.