Member since: 08-08-2017
Posts: 1652
Kudos Received: 30
Solutions: 11
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1944 | 06-15-2020 05:23 AM |
| | 15807 | 01-30-2020 08:04 PM |
| | 2093 | 07-07-2019 09:06 PM |
| | 8167 | 01-27-2018 10:17 PM |
| | 4639 | 12-31-2017 10:12 PM |
08-27-2018
06:57 AM
@Jay we still get the same results (we set 10K) - files such as gc.log-201808270653.0.current and gc.log-201808270656.0.current keep appearing (see the sketch after the listing below):
-rw-r--r-- 1 hdfs hadoop 3859 Aug 27 06:53 gc.log-201808270629.0.current
-rw-r--r-- 1 hdfs hadoop 727 Aug 27 06:53 hadoop-hdfs-datanode-worker01.sys76.com.out.1
-rw-r--r-- 1 hdfs hadoop 3653 Aug 27 06:56 gc.log-201808270653.0.current
-rw-r--r-- 1 hdfs hadoop 727 Aug 27 06:56 hadoop-hdfs-datanode-worker01.sys76.com.out
-rw-r--r-- 1 hdfs hadoop 518292 Aug 27 06:56 hadoop-hdfs-datanode-worker01.sys76.com.log
-rw-r--r-- 1 hdfs hadoop 3658 Aug 27 06:56 gc.log-201808270656.0.current
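For reference, an illustrative way to watch whether the active segment ever reaches the configured -XX:GCLogFileSize and rolls over. The path is the DataNode log directory used in this thread; the loop itself is only a sketch:

```bash
# When rotation actually happens, the ".N.current" suffix advances
# (.0.current -> .1.current ...) and the previous segment reappears
# without the ".current" suffix.
while true; do
  ls -ltr /var/log/hadoop/hdfs/gc.log-* | tail -n 3
  echo "----"
  sleep 30
done
```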
08-27-2018
06:38 AM
Hi all, we are trying to rotate the GC logs on the worker machines. We added the flags -XX:+PrintGCTimeStamps -XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=5 -XX:GCLogFileSize=2K to the HADOOP_DATANODE_OPTS variable in HDFS --> Advanced hadoop-env, as follows:

export HADOOP_DATANODE_OPTS="-server -XX:ParallelGCThreads=4 -XX:+UseConcMarkSweepGC -XX:ErrorFile=/var/log/hadoop/$USER/hs_err_pid%p.log -XX:NewSize=200m -XX:MaxNewSize=200m -Xloggc:/var/log/hadoop/$USER/gc.log-`date +'%Y%m%d%H%M'` -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=5 -XX:GCLogFileSize=2K -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -Xms{{dtnode_heapsize}} -Xmx{{dtnode_heapsize}} -Dhadoop.security.logger=INFO,DRFAS -Dhdfs.audit.logger=INFO,DRFAAUDIT ${HADOOP_DATANODE_OPTS}"

But on the worker machine (DataNode, under /var/log/hadoop/hdfs) we get the following:

-rw-r--r-- 1 hdfs hadoop 1035 Aug 26 20:33 hadoop-hdfs-datanode-worker01.sys76.com.log.10
-rw-r--r-- 1 hdfs hadoop 1035 Aug 26 20:35 hadoop-hdfs-datanode-worker01.sys76.com.log.9
-rw-r--r-- 1 hdfs hadoop 1035 Aug 26 20:36 hadoop-hdfs-datanode-worker01.sys76.com.log.8
-rw-r--r-- 1 hdfs hadoop 1504 Aug 26 20:37 hadoop-hdfs-datanode-worker01.sys76.com.log.7
-rw-r--r-- 1 hdfs hadoop 1035 Aug 26 20:38 hadoop-hdfs-datanode-worker01.sys76.com.log.6
-rw-r--r-- 1 hdfs hadoop 1035 Aug 26 20:40 hadoop-hdfs-datanode-worker01.sys76.com.log.5
-rw-r--r-- 1 hdfs hadoop 1035 Aug 26 20:41 hadoop-hdfs-datanode-worker01.sys76.com.log.4
-rw-r--r-- 1 hdfs hadoop 1035 Aug 26 20:43 hadoop-hdfs-datanode-worker01.sys76.com.log.3
-rw-r--r-- 1 hdfs hadoop 1035 Aug 26 20:44 hadoop-hdfs-datanode-worker01.sys76.com.log.2
-rw-r--r-- 1 hdfs hadoop 1035 Aug 26 20:46 hadoop-hdfs-datanode-worker01.sys76.com.log.1
-rw-r--r-- 1 hdfs hadoop 727 Aug 26 20:46 hadoop-hdfs-datanode-worker01.sys76.com.out.4
-rw-r--r-- 1 hdfs hadoop 3905 Aug 26 20:46 gc.log-201808262031.0.current
-rw-r--r-- 1 hdfs hadoop 727 Aug 26 20:46 hadoop-hdfs-datanode-worker01.sys76.com.out.3
-rw-r--r--. 1 hdfs hadoop 101346 Aug 26 21:53 SecurityAuth.audit
-rw-r--r-- 1 hdfs hadoop 7488 Aug 27 06:25 gc.log-201808262046
-rw-r--r-- 1 hdfs hadoop 727 Aug 27 06:28 hadoop-hdfs-datanode-worker01.sys76.com.out.2
-rw-r--r-- 1 hdfs hadoop 3651 Aug 27 06:28 gc.log-201808270625.0.current
-rw-r--r-- 1 hdfs hadoop 727 Aug 27 06:29 hadoop-hdfs-datanode-worker01.sys76.com.out.1
-rw-r--r-- 1 hdfs hadoop 727 Aug 27 06:29 hadoop-hdfs-datanode-worker01.sys76.com.out
-rw-r--r-- 1 hdfs hadoop 2940 Aug 27 06:29 gc.log-201808270629.0.current
-rw-r--r-- 1 hdfs hadoop 378069 Aug 27 06:34 hadoop-hdfs-datanode-worker01.sys76.com.log

Why are the gc.log-* files not rotated, and why do they end with ".0.current"? What is wrong with my configuration?

** We also tried the following, but it did not work either: in the hadoop-env template, wherever the -XX:+PrintGCDetails parameter appears, add -XX:+PrintGCTimeStamps -XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=5 -XX:GCLogFileSize=2M. For example:

From:
HADOOP_JOBTRACKER_OPTS="-server -XX:ParallelGCThreads=8 -XX:+UseConcMarkSweepGC -XX:ErrorFile={{hdfs_log_dir_prefix}}/$USER/hs_err_pid%p.log -XX:NewSize={{jtnode_opt_newsize}} -XX:MaxNewSize={{jtnode_opt_maxnewsize}} -Xloggc:{{hdfs_log_dir_prefix}}/$USER/gc.log-`date +'%Y%m%d%H%M'` -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -Xmx{{jtnode_heapsize}} -Dhadoop.security.logger=INFO,DRFAS -Dmapred.audit.logger=INFO,MRAUDIT -Dhadoop.mapreduce.jobsummary.logger=INFO,JSA ${HADOOP_JOBTRACKER_OPTS}"

To:
HADOOP_JOBTRACKER_OPTS="-server -XX:ParallelGCThreads=8 -XX:+UseConcMarkSweepGC -XX:ErrorFile={{hdfs_log_dir_prefix}}/$USER/hs_err_pid%p.log -XX:NewSize={{jtnode_opt_newsize}} -XX:MaxNewSize={{jtnode_opt_maxnewsize}} -Xloggc:{{hdfs_log_dir_prefix}}/$USER/gc.log-`date +'%Y%m%d%H%M'` -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=5 -XX:GCLogFileSize=2M -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -Xmx{{jtnode_heapsize}} -Dhadoop.security.logger=INFO,DRFAS -Dmapred.audit.logger=INFO,MRAUDIT -Dhadoop.mapreduce.jobsummary.logger=INFO,JSA ${HADOOP_JOBTRACKER_OPTS}"
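For context, a minimal sketch of how HotSpot (JDK 7/8) names GC log files when -XX:+UseGCLogFileRotation is in effect, assuming the DataNode log directory used above. The ".0.current" suffix by itself is not a failure sign, and because -Xloggc embeds `date +'%Y%m%d%H%M'`, every DataNode restart starts a fresh base name, so several *.0.current files can simply mean several restarts rather than broken rotation:

```bash
# Sketch only - naming behaviour of -XX:+UseGCLogFileRotation with -XX:NumberOfGCLogFiles=5:
#   gc.log-<timestamp>.0.current   <- segment the JVM is writing right now
#   gc.log-<timestamp>.0           <- a segment that reached -XX:GCLogFileSize and was closed
#   gc.log-<timestamp>.1.current   <- the next active segment, and so on up to .4, then wrap-around
# Note: -XX:GCLogFileSize=2K is very small; as far as I recall, some JDK 8 builds silently raise
# values below 8K to a minimum, so a segment may grow past 2 KB before it rolls.
# List only the closed (already rotated) segments:
ls -ltr /var/log/hadoop/hdfs/gc.log-*.[0-9] 2>/dev/null \
  || echo "no closed GC log segments yet - the active segment has not reached GCLogFileSize"
```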
08-19-2018
08:10 AM
@Jay - that part already worked, following pappu's answer (cp /usr/hdp/current/hive-server2/lib/apache-log4j-extras-1.2.17.jar /usr/lib/ams-hbase/lib/). My problem is with ambari-metrics-collector.log, which is rotated but not zipped; see also my current conf: ams-log4j.txt
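In case it helps to compare against ams-log4j.txt: a minimal sketch of a log4j-extras rolling setup that also gzips the rotated files (a FileNamePattern ending in .gz is what triggers compression in apache-log4j-extras). The file path and appender name below are assumptions; on an Ambari-managed cluster these lines would go into the ams-log4j template in the UI rather than be edited on disk, and whether the collector's log4j honours this nested-property style from a .properties file is itself uncertain (the log4j:WARN messages further down in this thread hint at that):

```bash
# Sketch only - property names follow the apache-log4j-extras RollingFileAppender API;
# adjust the appender name ("file") and paths to match the actual ams-log4j template.
cat >> /etc/ambari-metrics-collector/conf/log4j.properties <<'EOF'
log4j.appender.file=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.file.rollingPolicy=org.apache.log4j.rolling.TimeBasedRollingPolicy
log4j.appender.file.rollingPolicy.ActiveFileName=/var/log/ambari-metrics-collector/ambari-metrics-collector.log
# the trailing .gz is what makes log4j-extras compress each rolled file
log4j.appender.file.rollingPolicy.FileNamePattern=/var/log/ambari-metrics-collector/ambari-metrics-collector.log.%d{yyyy-MM-dd}.gz
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
EOF
```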
08-18-2018
06:36 PM
Do you have any direction on this? If the issue can't be solved, that's OK - I just won't use the log4j zip option.
08-16-2018
09:05 PM
ls -ltr /usr/lib/ambari-metrics-collector/apache-log4j-extras-1.2.17.jar
-rw-r--r-- 1 root root 448794 Aug 16 18:51 /usr/lib/ambari-metrics-collector/apache-log4j-extras-1.2.17.jar
08-16-2018
08:25 PM
Also:
log4j:WARN Failed to set property [triggeringPolicy] to value "org.apache.log4j.rolling.SizeBasedTriggeringPolicy".
log4j:WARN Failed to set property [rollingPolicy] to value "org.apache.log4j.rolling.FixedWindowRollingPolicy".
log4j:WARN Please set a rolling policy for the RollingFileAppender named 'file'
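These warnings typically mean log4j could not apply the rollingPolicy/triggeringPolicy settings to the appender it instantiated - often because the appender class configured for 'file' is still the stock org.apache.log4j.RollingFileAppender rather than org.apache.log4j.rolling.RollingFileAppender from apache-log4j-extras, or because the extras jar is not on the collector's classpath. A quick check, assuming default install paths (the log4j.properties location below is an assumption; the jar path is the one shown later in this thread):

```bash
# Which appender class is actually configured for the 'file' appender?
grep -n '^log4j.appender.file=' /etc/ambari-metrics-collector/conf/log4j.properties
# Is the extras jar present where the collector picks up its libraries?
ls -l /usr/lib/ambari-metrics-collector/apache-log4j-extras-1.2.17.jar
```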
08-16-2018
08:21 PM
What does this mean: log4j:ERROR No output stream or file set for the appender named [file]?
08-16-2018
07:11 PM
more ambari-metrics-collector.out
log4j:WARN Failed to set property [triggeringPolicy] to value "org.apache.log4j.rolling.SizeBasedTriggeringPolicy".
log4j:WARN Failed to set property [rollingPolicy] to value "org.apache.log4j.rolling.FixedWindowRollingPolicy".
log4j:WARN Please set a rolling policy for the RollingFileAppender named 'file'
log4j:ERROR No output stream or file set for the appender named [file].
Aug 16, 2018 7:04:28 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.YarnJacksonJaxbJsonProvider as a provider class
Aug 16, 2018 7:04:28 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.applicationhistoryservice.webapp.AHSWebServices as a root resource class
Aug 16, 2018 7:04:28 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.applicationhistoryservice.webapp.TimelineWebServices as a root resource class
Aug 16, 2018 7:04:28 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 16, 2018 7:04:28 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.11 12/09/2011 10:27 AM'
Aug 16, 2018 7:04:29 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 16, 2018 7:04:29 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.YarnJacksonJaxbJsonProvider to GuiceManagedComponentProvider with the scope "Singleton"
Aug 16, 2018 7:04:29 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.applicationhistoryservice.webapp.AHSWebServices to GuiceManagedComponentProvider with the scope "Singleton"
Aug 16, 2018 7:04:29 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.applicationhistoryservice.webapp.TimelineWebServices to GuiceManagedComponentProvider with the scope "Singleton"