Created 01-04-2019 05:50 AM
My NameNode disk is filling up due to the audit logs below.
Can we delete these log files?
-rw-r--r--. 1 hdfs hadoop 2.2G Dec 20 23:59 hdfs-audit.log.2018-12-20
-rw-r--r--. 1 hdfs hadoop 7.2G Dec 21 23:59 hdfs-audit.log.2018-12-21
-rw-r--r--. 1 hdfs hadoop 7.3G Dec 22 23:59 hdfs-audit.log.2018-12-22
-rw-r--r--. 1 hdfs hadoop 7.1G Dec 23 23:59 hdfs-audit.log.2018-12-23
-rw-r--r--. 1 hdfs hadoop  17G Dec 24 23:59 hdfs-audit.log.2018-12-24
-rw-r--r--. 1 hdfs hadoop  23G Dec 25 23:59 hdfs-audit.log.2018-12-25
Created 01-04-2019 05:58 AM
If you have disk space issues then you can delete the old hdfs-audit.log files. However, for auditing purposes (like who did what on HDFS) these logs are useful.
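If you just need to free space right away, something like the following should be safe (only a sketch, assuming the HDP default log directory /var/log/hadoop/hdfs; replace it with your actual hadoop.log.dir):

# Compress rotated audit logs older than 7 days instead of deleting them, preserving audit history
find /var/log/hadoop/hdfs -name 'hdfs-audit.log.2*' ! -name '*.gz' -mtime +7 -exec gzip {} \;

# Or, if the audit history is not required, remove rotated logs older than 30 days
find /var/log/hadoop/hdfs -name 'hdfs-audit.log.2*' -mtime +30 -delete

The active hdfs-audit.log itself is held open by the NameNode, so only touch the dated rotated files.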
If you want to set a particular size limit for your hdfs-audit log, then the best option is to change the appender named "DRFAAUDIT" inside the "Advanced hdfs-log4j" section of Ambari to use "RollingFileAppender" instead of the default "DailyRollingFileAppender".
Ambari UI --> HDFS --> Config --> Advanced --> Advanced hdfs-log4j
Edit the hdfs-log4j template section containing the "DRFAAUDIT" appender definition to something like the following:
hdfs.audit.logger=INFO,console
log4j.logger.org.apache.hadoop.hdfs.server.namenode.FSNamesystem.audit=${hdfs.audit.logger}
log4j.additivity.org.apache.hadoop.hdfs.server.namenode.FSNamesystem.audit=false
log4j.appender.DRFAAUDIT=org.apache.log4j.RollingFileAppender
log4j.appender.DRFAAUDIT.File=${hadoop.log.dir}/hdfs-audit.log
log4j.appender.DRFAAUDIT.layout=org.apache.log4j.PatternLayout
log4j.appender.DRFAAUDIT.layout.ConversionPattern=%d{ISO8601} %p %c{2}: %m%n
log4j.appender.DRFAAUDIT.MaxFileSize=500MB
log4j.appender.DRFAAUDIT.MaxBackupIndex=20
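With these settings the audit log is capped at about 500 MB * 21 files (the active file plus 20 backups), i.e. roughly 10.5 GB on disk; adjust MaxFileSize and MaxBackupIndex to match the space available on the NameNode disk.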
Restart HDFS.
As an alternative approach, you can refer to the following article to have the hdfs audit logs compressed automatically:
How to enable HDFS Audit log rotation and zipping for logs? https://community.hortonworks.com/content/supportkb/150088/how-to-enable-hdfs-audit-log-rotation-and...
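If you would rather keep the default DailyRollingFileAppender, another option (again just a sketch, assuming the same /var/log/hadoop/hdfs location) is a nightly cron job on the NameNode host that gzips the previous days' rotated files:

# Run at 01:00 daily; compress dated audit logs older than a day, skipping already-compressed ones
0 1 * * * find /var/log/hadoop/hdfs -name 'hdfs-audit.log.2*' ! -name '*.gz' -mtime +0 -exec gzip {} \;

This cron approach is independent of the log4j-based method described in the article above.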
Created 01-04-2019 06:00 AM