Support Questions


Can we delete old hdfs-audit.log files from the NameNode?

My NameNode disk is filling up due to the audit logs below.

Can we delete the log files below?

-rw-r--r--. 1 hdfs hadoop 2.2G Dec 20 23:59 hdfs-audit.log.2018-12-20
-rw-r--r--. 1 hdfs hadoop 7.2G Dec 21 23:59 hdfs-audit.log.2018-12-21
-rw-r--r--. 1 hdfs hadoop 7.3G Dec 22 23:59 hdfs-audit.log.2018-12-22
-rw-r--r--. 1 hdfs hadoop 7.1G Dec 23 23:59 hdfs-audit.log.2018-12-23
-rw-r--r--. 1 hdfs hadoop  17G Dec 24 23:59 hdfs-audit.log.2018-12-24
-rw-r--r--. 1 hdfs hadoop  23G Dec 25 23:59 hdfs-audit.log.2018-12-25

Super Mentor

@Nitin Suradkar

If you have disk space issues, then yes, you can delete the old hdfs-audit.log files. Keep in mind, however, that these logs are useful for auditing purposes (tracking who did what on HDFS), so only delete them if you do not need that history.
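As a rough sketch (not an official procedure), you could reclaim space by compressing older rotated audit logs and deleting the oldest ones. The log directory and the 7-day/30-day retention windows below are assumptions; adjust them for your cluster:

```shell
# Assumed log location on an HDP NameNode -- verify for your environment.
LOG_DIR=${LOG_DIR:-/var/log/hadoop/hdfs}

if [ -d "$LOG_DIR" ]; then
  # gzip rotated audit logs older than 7 days (still readable with zcat)
  find "$LOG_DIR" -name 'hdfs-audit.log.*' ! -name '*.gz' -mtime +7 -exec gzip {} \;
  # delete compressed audit logs older than 30 days
  find "$LOG_DIR" -name 'hdfs-audit.log.*.gz' -mtime +30 -delete
fi
```

Do not touch the active hdfs-audit.log file itself while the NameNode is running; only the date-suffixed rotated files are safe to remove.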

If you want to enforce a particular size limit on your hdfs-audit log, the best option is to change the appender named "DRFAAUDIT" inside the "Advanced hdfs-log4j" section of Ambari to use "RollingFileAppender" instead of the default "DailyRollingFileAppender".

Ambari UI --> HDFS --> Config --> Advanced --> Advanced hdfs-log4j

Edit the hdfs-log4j template section containing the "DRFAAUDIT" appender definition to something like the following:

log4j.appender.DRFAAUDIT.layout.ConversionPattern=%d{ISO8601} %p %c{2}: %m%n
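For reference, a size-based DRFAAUDIT definition might look like the sketch below. The 256MB size and 20-file backup count are illustrative values, not recommendations; tune them to how much disk you can spare (in this example, roughly 5 GB of audit history is retained):

```properties
# Roll by size instead of by day; oldest backup is deleted automatically.
log4j.appender.DRFAAUDIT=org.apache.log4j.RollingFileAppender
log4j.appender.DRFAAUDIT.File=${hadoop.log.dir}/hdfs-audit.log
log4j.appender.DRFAAUDIT.MaxFileSize=256MB
log4j.appender.DRFAAUDIT.MaxBackupIndex=20
log4j.appender.DRFAAUDIT.layout=org.apache.log4j.PatternLayout
log4j.appender.DRFAAUDIT.layout.ConversionPattern=%d{ISO8601} %p %c{2}: %m%n
```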

Restart HDFS.



As an alternative approach, you can refer to the following article to have the HDFS audit logs compressed automatically:

How to enable HDFS Audit log rotation and zipping for logs?
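The gist of that approach, as a hedged sketch: log4j 1.x can gzip rotated files if you switch DRFAAUDIT to the RollingFileAppender from the apache-log4j-extras companion jar (which must be on the NameNode classpath) and give the FileNamePattern a .gz suffix. The exact property names below are assumptions to verify against the article:

```properties
# Requires the apache-log4j-extras jar; the .gz suffix triggers compression on roll.
log4j.appender.DRFAAUDIT=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.DRFAAUDIT.rollingPolicy=org.apache.log4j.rolling.TimeBasedRollingPolicy
log4j.appender.DRFAAUDIT.rollingPolicy.ActiveFileName=${hadoop.log.dir}/hdfs-audit.log
log4j.appender.DRFAAUDIT.rollingPolicy.FileNamePattern=${hadoop.log.dir}/hdfs-audit.log-%d{yyyy-MM-dd}.gz
log4j.appender.DRFAAUDIT.layout=org.apache.log4j.PatternLayout
log4j.appender.DRFAAUDIT.layout.ConversionPattern=%d{ISO8601} %p %c{2}: %m%n
```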
