06-13-2017
11:08 AM
Hi @Jay SenSharma, do you have any similar articles for Hive log compression?
06-07-2017
11:50 AM
Were you able to resolve this issue? @Vikram Chandel
06-01-2017
11:46 AM
@Rahul Buragohain, could you also explain rolling.RollingFileAppender?
06-01-2017
10:38 AM
Thanks @Rahul Buragohain, a few more clarifications please:
1. If I want to avoid the 30-day limit, I should not be using FixedWindowRollingPolicy/maxIndex and SizeBasedTriggeringPolicy/MaxFileSize, if I understand correctly?
2. I know I am being a bit greedy here, but do you know of any other built-in process to archive/move the zipped logs to HDFS or another location instead of deleting them? (Yes, I can write a script, but I want to check whether I am missing any existing appenders/policies.)
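On question 2: as far as I know, log4j 1.x ships no appender that moves rolled files into HDFS, so a small scheduled script is the usual route. A minimal sketch, assuming the rolled files follow the hdfs-audit-%i.log.gz pattern from this thread; the directory arguments and the use of hdfs dfs -put -f are assumptions, not something confirmed here:

```shell
# Hypothetical helper: copy rolled, gzipped audit logs into HDFS,
# deleting each local file only after a successful copy.
# Usage: archive_audit_logs <local_log_dir> <hdfs_dest_dir> [copy_command]
archive_audit_logs() {
  log_dir="$1"
  dest="$2"
  put_cmd="${3:-hdfs dfs -put -f}"   # override with e.g. "cp" for local testing

  for f in "$log_dir"/hdfs-audit-*.log.gz; do
    [ -e "$f" ] || continue          # glob matched nothing: skip the literal pattern
    if $put_cmd "$f" "$dest"/; then
      rm -f "$f"                     # remove the local copy only on success
    fi
  done
}
```

Run from cron (or Oozie) at whatever cadence keeps local disk usage in check; because the active hdfs-audit.log never matches the *.log.gz glob, the file currently being written is never touched.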
06-01-2017
09:57 AM
Can you confirm whether the settings below achieve what is described in the subject?
hdfs.audit.logger=INFO,console
log4j.logger.org.apache.hadoop.hdfs.server.namenode.FSNamesystem.audit=${hdfs.audit.logger}
log4j.additivity.org.apache.hadoop.hdfs.server.namenode.FSNamesystem.audit=false
log4j.appender.DRFAAUDIT=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.DRFAAUDIT.File=${hadoop.log.dir}/hdfs-audit.log
log4j.appender.DRFAAUDIT.layout=org.apache.log4j.PatternLayout
log4j.appender.DRFAAUDIT.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
log4j.appender.DRFAAUDIT.rollingPolicy=org.apache.log4j.rolling.FixedWindowRollingPolicy
log4j.appender.DRFAAUDIT.rollingPolicy.maxIndex=30
log4j.appender.DRFAAUDIT.triggeringPolicy=org.apache.log4j.rolling.SizeBasedTriggeringPolicy
log4j.appender.DRFAAUDIT.triggeringPolicy.MaxFileSize=16106127360
## 16106127360 bytes = 15 GB ##
log4j.appender.DRFAAUDIT.rollingPolicy.ActiveFileName=${hadoop.log.dir}/hdfs-audit.log
log4j.appender.DRFAAUDIT.rollingPolicy.FileNamePattern=${hadoop.log.dir}/hdfs-audit-%i.log.gz
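A quick sanity check on the MaxFileSize figure in that config — 15 GiB expressed in bytes:

```shell
# 15 GiB = 15 * 1024^3 bytes, matching the MaxFileSize value above.
echo $((15 * 1024 * 1024 * 1024))
# prints 16106127360
```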