
Hadoop suppress security logger log level

New Contributor

Hi Team,

I want to change the log level of the Hadoop security logger from INFO to ERROR. I've exported the following environment variable for that:


I can see that the new parameter is reflected in the NameNode command line:

hdfs 13048 2.1 0.7 3899916 509000 ? Sl 01:22 0:29 /usr/lib/jvm/java-11-openjdk-amd64/bin/java -Dproc_namenode -Dhdfs.audit.logger=INFO,NullAppender,RFA -Dyarn.log.dir=/var/log/hadoop-hdfs -Dyarn.log.file=hadoop-hdfs-namenode-hadoop-master-1.log -Dyarn.home.dir=/usr/lib/hadoop-yarn -Dyarn.root.logger=INFO,console -Djava.library.path=/usr/lib/hadoop/lib/native -Xmx2048m -Dhadoop.log.dir=/var/log/hadoop-hdfs -Dhadoop.log.file=hadoop-hdfs-namenode-hadoop-master-1.log -Dhadoop.home.dir=/usr/lib/hadoop -Dhadoop.root.logger=ERROR,RFA -Dhadoop.policy.file=hadoop-policy.xml org.apache.hadoop.hdfs.server.namenode.NameNode

But I can still see INFO-level logs from the security logger in the log file:

[Socket Reader #1 for port 9820] INFO - Auth successful for hdfs/hadoop-master-1.hadoop.default.svc.cluster.local@CLUSTER.LOCAL (auth:KERBEROS)
[Socket Reader #1 for port 9820] INFO - Authorization successful for hdfs/hadoop-master-1.hadoop.default.svc.cluster.local@CLUSTER.LOCAL (auth:KERBEROS) for protocol=interface org.apache.hadoop.ha.HAServiceProtocol

Is there anything I'm missing here?


Rising Star

Hi @akshaydalvi, I am not sure whether you are using Cloudera Manager or not. Please confirm. Make sure your log4j configuration reflects what you are trying to change. You could add a SecurityLogger line ending in ",RFAS" under the log4j safety valve. Be sure that you are using the right rolling appender name, RFA or RFAS. We generally use RFAS for the SecurityLogger, as shown in the snippet below.
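For context, in a stock Apache Hadoop log4j.properties the security logger is bound to its own property, not to hadoop.root.logger, which is why changing only the root logger leaves SecurityLogger output untouched. A rough sketch of the relevant defaults (property and appender names here follow the Apache-shipped file; verify them against the log4j.properties actually deployed on your cluster):

```properties
# SecurityLogger categories log through hadoop.security.logger,
# independently of hadoop.root.logger
hadoop.security.logger=INFO,RFAS
log4j.category.SecurityLogger=${hadoop.security.logger}

# RFAS is the rolling file appender typically used for the security log
log4j.appender.RFAS=org.apache.log4j.RollingFileAppender
log4j.appender.RFAS.File=${hadoop.log.dir}/${hadoop.security.log.file}
log4j.appender.RFAS.layout=org.apache.log4j.PatternLayout
log4j.appender.RFAS.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
```

So to silence the INFO auth messages, the value to change is hadoop.security.logger (e.g. to ERROR,RFAS), rather than hadoop.root.logger.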

Here is how we configure the log4j safety valve in Cloudera Manager under HDFS -> Configuration:

[Attachment: Screenshot 2023-12-08 at 6.17.41 PM.png — log4j safety valve configuration in Cloudera Manager]
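In plain text, the safety valve entry would be a one-line logger override along these lines (a sketch; the RFAS appender name is an assumption based on the usual SecurityLogger setup, so adjust it to whatever appender your cluster's log4j.properties defines):

```properties
# Raise the security logger threshold from INFO to ERROR
log4j.logger.SecurityLogger=ERROR,RFAS
```

After saving the configuration, restart the affected HDFS roles for the change to take effect.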