Member since
08-08-2017
Posts: 1652
Kudos Received: 30
Solutions: 11
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2093 | 06-15-2020 05:23 AM |
| | 17446 | 01-30-2020 08:04 PM |
| | 2255 | 07-07-2019 09:06 PM |
| | 8723 | 01-27-2018 10:17 PM |
| | 4913 | 12-31-2017 10:12 PM |
08-16-2018
07:59 AM
We want to create a document that includes QA tests for an Ambari cluster. The tests should be like the following examples: 1. Restart all cluster machines without stopping the services, and verify that all services are up after boot. 2. Disconnect/reconnect the LAN cable of one machine and verify that the services/components on that machine are up after the cable is reconnected, etc. We would be happy to get tests of this kind, so we can perform QA on our clusters.
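A sketch of how test 1 could be automated, assuming an Ambari server at a placeholder host with placeholder admin:admin credentials; `check_services` is a hypothetical helper name, and the `/api/v1/clusters/.../services?fields=ServiceInfo/state` endpoint is Ambari's standard way to list per-service state:

```shell
# Hedged sketch for test 1 (reboot without stopping services): after the
# machines come back, ask Ambari's REST API for every service state and
# flag anything that is not STARTED. Host, cluster name, and credentials
# below are placeholders for your environment:
#
#   curl -s -u admin:admin \
#     "http://ambari.example.com:8080/api/v1/clusters/CLUSTER/services?fields=ServiceInfo/state" \
#     | check_services
check_services() {
  # Pair each service_name with the state that follows it in the JSON,
  # then keep only the pairs whose state is not STARTED.
  grep -oE '"(service_name|state)" *: *"[A-Z0-9_]+"' \
    | paste - - \
    | grep -v '"STARTED"' \
    || echo "all services STARTED"
}
```

Test 2 could reuse the same check scoped to the host components of the unplugged machine (`/api/v1/clusters/CLUSTER/hosts/HOST/host_components`).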
08-15-2018
08:40 PM
These are the new logs that I get now:
-rw-r--r-- 1 hive hadoop 389 Aug 15 19:28 hivemetastore.log-.10.log.gz
-rw-r--r--. 1 hive hadoop 0 Aug 15 19:29 hive.err
-rw-r--r--. 1 hive hadoop 31 Aug 15 19:29 hive.out
-rw-r--r--. 1 hive hadoop 0 Aug 15 19:29 hive-server2.out
-rw-r--r--. 1 hive hadoop 0 Aug 15 19:29 hive-server2.err
-rw-r--r-- 1 hive hadoop 1719 Aug 15 19:34 hivemetastore.log-.9.log.gz
-rw-r--r-- 1 hive hadoop 4357 Aug 15 19:34 hiveserver2-report.json
-rw-r--r-- 1 hive hadoop 4315 Aug 15 19:34 hivemetastore-report.json
-rw-r--r-- 1 hive hadoop 1240 Aug 15 19:42 hiveserver2.log-.10.log.gz
-rw-r--r-- 1 hive hadoop 1038 Aug 15 19:43 hivemetastore.log-.8.log.gz
-rw-r--r-- 1 hive hadoop 1159 Aug 15 19:48 hiveserver2.log-.9.log.gz
-rw-r--r-- 1 hive hadoop 1038 Aug 15 19:49 hivemetastore.log-.7.log.gz
-rw-r--r-- 1 hive hadoop 1150 Aug 15 19:54 hiveserver2.log-.8.log.gz
-rw-r--r-- 1 hive hadoop 1019 Aug 15 19:58 hivemetastore.log-.6.log.gz
-rw-r--r-- 1 hive hadoop 1168 Aug 15 20:00 hiveserver2.log-.7.log.gz
-rw-r--r-- 1 hive hadoop 1174 Aug 15 20:06 hiveserver2.log-.6.log.gz
-rw-r--r-- 1 hive hadoop 1016 Aug 15 20:07 hivemetastore.log-.5.log.gz
-rw-r--r-- 1 hive hadoop 1156 Aug 15 20:12 hiveserver2.log-.5.log.gz
-rw-r--r-- 1 hive hadoop 1019 Aug 15 20:13 hivemetastore.log-.4.log.gz
-rw-r--r-- 1 hive hadoop 1159 Aug 15 20:18 hiveserver2.log-.4.log.gz
-rw-r--r-- 1 hive hadoop 1028 Aug 15 20:22 hivemetastore.log-.3.log.gz
-rw-r--r-- 1 hive hadoop 1171 Aug 15 20:24 hiveserver2.log-.3.log.gz
-rw-r--r-- 1 hive hadoop 1167 Aug 15 20:30 hiveserver2.log-.2.log.gz
-rw-r--r-- 1 hive hadoop 1017 Aug 15 20:31 hivemetastore.log-.2.log.gz
-rw-r--r-- 1 hive hadoop 233 Aug 15 20:36 hiveserver2.log
-rw-r--r-- 1 hive hadoop 1166 Aug 15 20:36 hiveserver2.log-.1.log.gz
-rw-r--r-- 1 hive hadoop 1003 Aug 15 20:37 hivemetastore.log-.1.log.gz
-rw-r--r-- 1 hive hadoop 1218 Aug 15 20:37 hivemetastore.log
-rw-r--r-- 1 hive hadoop 4248 Aug 15 20:39 hivemetastore-report.json.tmp
-rw-r--r-- 1 hive hadoop 4363 Aug 15 20:39 hiveserver2-report.json.tmp
[root@master01 hive]# du -sh *
0 hive.err
4.0K hivemetastore.log
4.0K hivemetastore.log-.10.log.gz
4.0K hivemetastore.log-.1.log.gz
4.0K hivemetastore.log-.2.log.gz
4.0K hivemetastore.log-.3.log.gz
4.0K hivemetastore.log-.4.log.gz
4.0K hivemetastore.log-.5.log.gz
4.0K hivemetastore.log-.6.log.gz
4.0K hivemetastore.log-.7.log.gz
4.0K hivemetastore.log-.8.log.gz
4.0K hivemetastore.log-.9.log.gz
8.0K hivemetastore-report.json
8.0K hivemetastore-report.json.tmp
4.0K hive.out
0 hive-server2.err
4.0K hiveserver2.log
4.0K hiveserver2.log-.10.log.gz
4.0K hiveserver2.log-.1.log.gz
4.0K hiveserver2.log-.2.log.gz
4.0K hiveserver2.log-.3.log.gz
4.0K hiveserver2.log-.4.log.gz
4.0K hiveserver2.log-.5.log.gz
4.0K hiveserver2.log-.6.log.gz
4.0K hiveserver2.log-.7.log.gz
4.0K hiveserver2.log-.8.log.gz
4.0K hiveserver2.log-.9.log.gz
0 hive-server2.out
8.0K hiveserver2-report.json
8.0K hiveserver2-report.json.tmp
08-15-2018
07:15 PM
OK, let me try to set it on my Ambari; I will update....
08-15-2018
07:08 PM
Second, my Ambari cluster is based on Linux machines, so what does this line do:
log4j.appender.DRFA.rollingPolicy=org.apache.log4j.rolling.FixedWindowRollingPolicy
08-15-2018
07:05 PM
Do you mean that I need to replace my lines:
log4j.appender.DRFA=org.apache.log4j.RollingFileAppender
log4j.appender.DRFA.rollingPolicy.FileNamePattern=${hive.log.dir}/${hive.log.file}-.%i.log.zip
log4j.appender.DRFA.MaxBackupIndex=10
log4j.appender.DRFA.MaxFileSize=1KB
with:
log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
log4j.appender.DRFA=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.DRFA.rollingPolicy=org.apache.log4j.rolling.FixedWindowRollingPolicy
log4j.appender.DRFA.triggeringPolicy=org.apache.log4j.rolling.SizeBasedTriggeringPolicy
log4j.appender.DRFA.rollingPolicy.ActiveFileName=${hive.log.dir}/${hive.log.file}.log
log4j.appender.DRFA.rollingPolicy.FileNamePattern=${hive.log.dir}/${hive.log.file}-.%i.log.gz
log4j.appender.DRFA.triggeringPolicy.MaxFileSize=10000
log4j.appender.DRFA.rollingPolicy.maxIndex=10
?
08-15-2018
06:34 PM
@Jay, can you help me regarding my last notes?
08-15-2018
02:31 PM
I also configured the following (without the date), but still no zip either:
log4j.appender.DRFA=org.apache.log4j.RollingFileAppender
log4j.appender.DRFA.rollingPolicy.FileNamePattern=${hive.log.dir}/${hive.log.file}-.%i.log.zip
log4j.appender.DRFA.MaxBackupIndex=10
log4j.appender.DRFA.MaxFileSize=1KB
08-15-2018
02:08 PM
Hi Jay, we configured the following, but the rotated files are still not zipped. What is wrong in my log4j?
hive.log.threshold=ALL
hive.root.logger=INFO,DRFA
hive.log.dir=${java.io.tmpdir}/${user.name}
hive.log.file=hive.log
# Define the root logger to the system property "hadoop.root.logger".
log4j.rootLogger=${hive.root.logger}, EventCounter
# Logging Threshold
log4j.threshold=${hive.log.threshold}
#
# Daily Rolling File Appender
#
# Use the PidDailyerRollingFileAppend class instead if you want to use separate log files
# for different CLI session.
#
# log4j.appender.DRFA=org.apache.hadoop.hive.ql.log.PidDailyRollingFileAppender
#log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DRFA.File=${hive.log.dir}/${hive.log.file}
# Rollver at midnight
#log4j.appender.DRFA.DatePattern=.yyyy-MM-dd
log4j.appender.DRFA=org.apache.log4j.RollingFileAppender
log4j.appender.DRFA.MaxBackupIndex=10
log4j.appender.DRFA.MaxFileSize=1KB
log4j.appender.DRFA.rollingPolicy.FileNamePattern=${hive.log.dir}/${hive.log.file}-.%d{yyyyMMdd}.log.gz
08-15-2018
01:22 PM
Hi all, we configured Hive and log4j with RollingFileAppender:
log4j.appender.DRFA=org.apache.log4j.RollingFileAppender
log4j.appender.DRFA.MaxBackupIndex=10
log4j.appender.DRFA.MaxFileSize=1KB
Full details:
# Define some default values that can be overridden by system properties
hive.log.threshold=ALL
hive.root.logger=INFO,DRFA
hive.log.dir=${java.io.tmpdir}/${user.name}
hive.log.file=hive.log
# Define the root logger to the system property "hadoop.root.logger".
log4j.rootLogger=${hive.root.logger}, EventCounter
# Logging Threshold
log4j.threshold=${hive.log.threshold}
#
# Daily Rolling File Appender
#
# Use the PidDailyerRollingFileAppend class instead if you want to use separate log files
# for different CLI session.
#
# log4j.appender.DRFA=org.apache.hadoop.hive.ql.log.PidDailyRollingFileAppender
#log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DRFA.File=${hive.log.dir}/${hive.log.file}
# Rollver at midnight
#log4j.appender.DRFA.DatePattern=.yyyy-MM-dd
log4j.appender.DRFA=org.apache.log4j.RollingFileAppender
log4j.appender.DRFA.MaxBackupIndex=10
log4j.appender.DRFA.MaxFileSize=1KB
# 30-day backup
log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
# Pattern format: Date LogLevel LoggerName LogMessage
#log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
# Debugging Pattern format
log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %-5p [%t]: %c{2} (%F:%M(%L)) - %m%n
The logs from the machine are:
-rw-r--r-- 1 hive hadoop 1113 Aug 15 13:10 hivemetastore.log.10
-rw-r--r-- 1 hive hadoop 1028 Aug 15 13:10 hivemetastore.log.9
-rw-r--r-- 1 hive hadoop 1070 Aug 15 13:11 hivemetastore.log.8
-rw-r--r-- 1 hive hadoop 1239 Aug 15 13:12 hiveserver2.log.10
-rw-r--r-- 1 hive hadoop 1154 Aug 15 13:13 hivemetastore.log.7
-rw-r--r-- 1 hive hadoop 1133 Aug 15 13:13 hivemetastore.log.6
-rw-r--r-- 1 hive hadoop 1055 Aug 15 13:15 hiveserver2.log.9
-rw-r--r-- 1 hive hadoop 1203 Aug 15 13:15 hiveserver2.log.8
-rw-r--r-- 1 hive hadoop 1098 Aug 15 13:15 hiveserver2.log.7
-rw-r--r-- 1 hive hadoop 1028 Aug 15 13:15 hiveserver2.log.6
-rw-r--r-- 1 hive hadoop 1239 Aug 15 13:15 hiveserver2.log.5
-rw-r--r-- 1 hive hadoop 1113 Aug 15 13:16 hivemetastore.log.5
-rw-r--r-- 1 hive hadoop 1028 Aug 15 13:16 hivemetastore.log.4
-rw-r--r-- 1 hive hadoop 1070 Aug 15 13:16 hivemetastore.log.3
-rw-r--r-- 1 hive hadoop 1048 Aug 15 13:18 hiveserver2.log.4
-rw-r--r-- 1 hive hadoop 1173 Aug 15 13:18 hiveserver2.log.3
-rw-r--r-- 1 hive hadoop 1157 Aug 15 13:18 hiveserver2.log.2
-rw-r--r-- 1 hive hadoop 1239 Aug 15 13:18 hiveserver2.log.1
-rw-r--r-- 1 hive hadoop 503 Aug 15 13:18 hiveserver2.log
-rw-r--r-- 1 hive hadoop 1154 Aug 15 13:19 hivemetastore.log.2
-rw-r--r-- 1 hive hadoop 1133 Aug 15 13:19 hivemetastore.log.1
-rw-r--r-- 1 hive hadoop 292 Aug 15 13:19 hivemetastore.log
-rw-r--r-- 1 hive hadoop 4904 Aug 15 13:20 hivemetastore-report.json.tmp
-rw-r--r-- 1 hive hadoop 4273 Aug 15 13:20 hiveserver2-report.json.tmp
My question is: all rotated logs such as hiveserver2.log.1, hiveserver2.log.2, etc. are not zipped. What change do I need to make in log4j in order to zip the rotated files?
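For reference, the stock org.apache.log4j.RollingFileAppender cannot compress its rotated files (and it ignores any rollingPolicy.* keys), so the .1, .2, ... backups stay plain text; gzip rotation needs the rolling appender from the apache-log4j-extras companion jar with a FileNamePattern ending in .gz. A minimal sketch, assuming that jar is on Hive's classpath (the SizeBasedTriggeringPolicy MaxFileSize is in bytes):

```properties
log4j.appender.DRFA=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.DRFA.rollingPolicy=org.apache.log4j.rolling.FixedWindowRollingPolicy
log4j.appender.DRFA.rollingPolicy.ActiveFileName=${hive.log.dir}/${hive.log.file}
log4j.appender.DRFA.rollingPolicy.FileNamePattern=${hive.log.dir}/${hive.log.file}-.%i.log.gz
log4j.appender.DRFA.rollingPolicy.maxIndex=10
log4j.appender.DRFA.triggeringPolicy=org.apache.log4j.rolling.SizeBasedTriggeringPolicy
log4j.appender.DRFA.triggeringPolicy.MaxFileSize=10000
log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %-5p [%t]: %c{2} (%F:%M(%L)) - %m%n
```

With this in place, after restarting the Metastore/HiveServer2 the backups should appear as hive.log-.1.log.gz, hive.log-.2.log.gz, and so on.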
08-15-2018
09:56 AM
Can I add the syntax at the end of the line, so the final line looks like this? export AMS_COLLECTOR_GC_OPTS="-XX:+UseConcMarkSweepGC -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCDateStamps -Xloggc:{{ams_collector_log_dir}}/collector-gc.log-`date +'%Y%m%d%H%M'` -XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=5 -XX:GCLogFileSize=4K"
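In principle yes: inside double quotes the backticks are expanded when the env file is sourced at collector start, so each start gets its own timestamped base name, and HotSpot's `-XX:+UseGCLogFileRotation` then appends its own numeric rotation suffixes (.0, .1, ...) to that name; verify on your setup. A small sketch of just the suffix expansion (the base name is taken from the line above):

```shell
# Illustrative sketch: the backtick command substitution runs when the env
# file is sourced, stamping the GC log base name once per collector start.
GC_LOG_NAME="collector-gc.log-`date +'%Y%m%d%H%M'`"
echo "$GC_LOG_NAME"   # e.g. collector-gc.log-201808150956
```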