Support Questions

Hive log4j logs have a very strange format

Hi all,

Under /var/log/hive I can see the following logs (example below).

I don't understand why the log names end up with a date suffix like "-20180803".

I don't have this structure defined in my hive-log4j:

-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.29-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.29-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.30
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.30-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.30-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.30-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.3-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.3-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.3-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.4-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.4-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.4-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.5-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.5-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.5-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.6-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.6-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.6-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.7-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.7-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.7-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.8-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.8-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.8-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.9-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.9-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.9-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hive-server2.out-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hive-server2.out-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hive-server2.out-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json-20180802-20180803
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json-20180802-20180803-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json-20180802-20180803-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json-20180802-20180803-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json-20180802-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json-20180802-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json-20180802-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json-20180803
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json-20180803-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json-20180803-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json-20180803-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json.tmp-20180802-20180803
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json.tmp-20180802-20180803-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json.tmp-20180802-20180803-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json.tmp-20180802-20180803-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json.tmp-20180802-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json.tmp-20180802-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json.tmp-20180802-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json.tmp-20180803
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json.tmp-20180803-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json.tmp-20180803-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json.tmp-20180803-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json.tmp-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json.tmp-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2-report.json.tmp-20180805

Here is an example of what we have in hive-log4j:

# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Define some default values that can be overridden by system properties
hive.log.threshold=ALL
hive.root.logger=INFO,DRFA
hive.log.dir=${java.io.tmpdir}/${user.name}
hive.log.file=hive.log
# Define the root logger to the system property "hadoop.root.logger".
log4j.rootLogger=${hive.root.logger}, EventCounter
# Logging Threshold
log4j.threshold=${hive.log.threshold}
#
# Daily Rolling File Appender
#
# Use the PidDailyRollingFileAppender class instead if you want to use separate log files
# for different CLI sessions.
#
# log4j.appender.DRFA=org.apache.hadoop.hive.ql.log.PidDailyRollingFileAppender
#log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DRFA.File=${hive.log.dir}/${hive.log.file}
# Rollover at midnight
log4j.appender.DRFA=org.apache.log4j.RollingFileAppender
log4j.appender.DRFA.MaxBackupIndex=10
log4j.appender.DRFA.MaxFileSize=100MB
#log4j.appender.DRFA.DatePattern=.yyyy-MM-dd
# 30-day backup
#log4j.appender.DRFA.MaxBackupIndex=30
log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
# Pattern format: Date LogLevel LoggerName LogMessage
#log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
# Debugging Pattern format
log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %-5p [%t]: %c{2} (%F:%M(%L)) - %m%n
#
# console
# Add "console" to rootlogger above if you want to use this
#
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} [%t]: %p %c{2}: %m%n
log4j.appender.console.encoding=UTF-8
#custom logging levels
#log4j.logger.xxx=DEBUG
#
# Event Counter Appender
# Sends counts of logging messages at different severity levels to Hadoop Metrics.
#
log4j.appender.EventCounter=org.apache.hadoop.hive.shims.HiveEventCounter
log4j.category.DataNucleus=ERROR,DRFA
log4j.category.Datastore=ERROR,DRFA
log4j.category.Datastore.Schema=ERROR,DRFA
log4j.category.JPOX.Datastore=ERROR,DRFA
log4j.category.JPOX.Plugin=ERROR,DRFA
log4j.category.JPOX.MetaData=ERROR,DRFA
log4j.category.JPOX.Query=ERROR,DRFA
log4j.category.JPOX.General=ERROR,DRFA
log4j.category.JPOX.Enhancer=ERROR,DRFA
# Silence useless ZK logs
log4j.logger.org.apache.zookeeper.server.NIOServerCnxn=WARN,DRFA
log4j.logger.org.apache.zookeeper.ClientCnxnSocketNIO=WARN,DRFA
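
With the non-commented settings above (RollingFileAppender, MaxFileSize=100MB, MaxBackupIndex=10, DatePattern commented out), rotation should be purely index-based. A minimal sketch of the naming this appender produces, using the same property placeholders as the config:

# Active rotation settings (copied from the config above)
log4j.appender.DRFA=org.apache.log4j.RollingFileAppender
log4j.appender.DRFA.File=${hive.log.dir}/${hive.log.file}
log4j.appender.DRFA.MaxFileSize=100MB
log4j.appender.DRFA.MaxBackupIndex=10
# Expected naming: the active file is ${hive.log.file}; when it reaches 100MB it is
# renamed to ${hive.log.file}.1, the old .1 becomes .2, and so on up to .10.
# RollingFileAppender never appends a date to the file name.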
Michael-Bronson
5 REPLIES

@Michael Bronson

The issue is with an incorrect log4j configuration, which is using a RollingFileAppender (RFA) under the DRFA appender name.

Try setting hive-log4j as below:

hive.log.threshold=ALL
hive.root.logger=INFO,DRFA
hive.log.dir=${java.io.tmpdir}/${user.name}
hive.log.file=hive.log

log4j.rootLogger=${hive.root.logger}, EventCounter

log4j.threshold=${hive.log.threshold}

log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender

log4j.appender.DRFA.File=${hive.log.dir}/${hive.log.file}

log4j.appender.DRFA.DatePattern=.yyyy-MM-dd

log4j.appender.DRFA.MaxBackupIndex=10
log4j.appender.DRFA.MaxFileSize=100MB
log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout

log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %-5p [%t]: %c{2} (%F:%M(%L)) - %m%n

log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} [%t]: %p %c{2}: %m%n
log4j.appender.console.encoding=UTF-8

log4j.appender.EventCounter=org.apache.hadoop.hive.shims.HiveEventCounter

log4j.category.DataNucleus=ERROR,DRFA
log4j.category.Datastore=ERROR,DRFA
log4j.category.Datastore.Schema=ERROR,DRFA
log4j.category.JPOX.Datastore=ERROR,DRFA
log4j.category.JPOX.Plugin=ERROR,DRFA
log4j.category.JPOX.MetaData=ERROR,DRFA
log4j.category.JPOX.Query=ERROR,DRFA
log4j.category.JPOX.General=ERROR,DRFA
log4j.category.JPOX.Enhancer=ERROR,DRFA


log4j.logger.org.apache.zookeeper.server.NIOServerCnxn=WARN,DRFA
log4j.logger.org.apache.zookeeper.ClientCnxnSocketNIO=WARN,DRFA

The logs would then look like this:

[root@ hive]# ls -ltr
total 3472
-rw-r--r-- 1 hive hadoop 1560203 Jul 26 23:59 hiveserver2.log.2018-07-26
-rw-r--r-- 1 hive hadoop 1128929 Jul 27 10:59 hiveserver2.log.2018-07-27
-rw-r--r-- 1 hive hadoop       0 Aug  6 07:22 hive-server2.out
-rw-r--r-- 1 hive hadoop      88 Aug  6 07:23 hive-server2.err
-rw-r--r-- 1 hive hadoop  799614 Aug  6 14:19 hiveserver2.log
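
For context, a minimal sketch of the date-based naming that DailyRollingFileAppender produces with the DatePattern above (note that, in log4j 1.x, DailyRollingFileAppender does not honor MaxFileSize/MaxBackupIndex, so those two lines are effectively ignored):

# Date-based rotation with DailyRollingFileAppender (settings from the config above)
log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DRFA.File=${hive.log.dir}/${hive.log.file}
log4j.appender.DRFA.DatePattern=.yyyy-MM-dd
# Expected naming: the active file is ${hive.log.file}; when the date boundary passes
# it is renamed to ${hive.log.file}.<yyyy-MM-dd> (e.g. hiveserver2.log.2018-07-26)
# and a fresh ${hive.log.file} is started.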

@sindhu, we configured log4j to use RollingFileAppender with MaxBackupIndex set to 10 backups, and we disabled log4j.appender.DRFA.DatePattern=.yyyy-MM-dd, so I don't understand what is wrong with my hive-log4j configuration. Could you please mark the incorrect lines in my hive-log4j?
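
For reference, a quick way to print only the appender lines that are actually in effect (a sketch only: it assumes the HDP default location /etc/hive/conf/hive-log4j.properties, so adjust the path to wherever your hive-log4j actually lives):

# Show only the non-commented DRFA appender settings
grep -v '^#' /etc/hive/conf/hive-log4j.properties | grep 'appender\.DRFA'

# Based on the config pasted in the question, this should print lines such as:
#   log4j.appender.DRFA=org.apache.log4j.RollingFileAppender
#   log4j.appender.DRFA.MaxBackupIndex=10
#   log4j.appender.DRFA.MaxFileSize=100MB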

Michael-Bronson

Secondly, in my hive-log4j I don't see that we are using RFA; where do you see that?

Michael-Bronson

This is an example of my logs:

hive.err
hive.out
hive-server2.out
hive-server2.err
hivemetastore.log.10
hivemetastore.log.9
hivemetastore.log.8
hivemetastore.log.7
hivemetastore.log.6
hiveserver2.log.10
hiveserver2.log.9
hiveserver2.log.8
hiveserver2.log.7
hiveserver2.log.6
hivemetastore.log.5
hivemetastore.log.4
hivemetastore.log.3
hiveserver2.log.5
hiveserver2.log.4
hiveserver2.log.3
hiveserver2.log.2
hiveserver2.log.1
hiveserver2.log
hivemetastore.log.2
hivemetastore.log.1
hivemetastore.log
hivemetastore-report.json.tmp
hiveserver2-report.json.tmp
Michael-Bronson

But additionally I also see these files.

So where does this "-20180805" suffix come from?

hiveserver2.log.30-20180804
hiveserver2.log.30-20180804-20180805
hiveserver2.log.30-20180805
hiveserver2.log.3-20180804
hiveserver2.log.3-20180804-20180805
hiveserver2.log.3-20180805
hiveserver2.log.4-20180804
hiveserver2.log.4-20180804-20180805
hiveserver2.log.4-20180805
hiveserver2.log.5-20180804
hiveserver2.log.5-20180804-20180805
hiveserver2.log.5-20180805
hiveserver2.log.6-20180804
hiveserver2.log.6-20180804-20180805
hiveserver2.log.6-20180805
hiveserver2.log.7-20180804
hiveserver2.log.7-20180804-20180805
hiveserver2.log.7-20180805
hiveserver2.log.8-20180804
hiveserver2.log.8-20180804-20180805
hiveserver2.log.8-20180805
hiveserver2.log.9-20180804
hiveserver2.log.9-20180804-20180805
hiveserver2.log.9-20180805
hive-server2.out-20180804
hive-server2.out-20180804-20180805
hive-server2.out-20180805
hiveserver2-report.json
hiveserver2-report.json-20180802-20180803
hiveserver2-report.json-20180802-20180803-20180804
hiveserver2-report.json-20180802-20180803-20180804-20180805
hiveserver2-report.json-20180802-20180803-20180805
hiveserver2-report.json-20180802-20180804
hiveserver2-report.json-20180802-20180804-20180805
hiveserver2-report.json-20180802-20180805
hiveserver2-report.json-20180803
Michael-Bronson