<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Seeing PSYoungGen GC on YARN nodes aggressively every 8-10 secs in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Seeing-PSYoungGen-GC-on-YARN-nodes-aggressively-every-8-10/m-p/95529#M8860</link>
    <description>&lt;PRE&gt;[root@hadm0301 ~]# free -c 5
             total       used       free     shared    buffers     cached
Mem:     148308580  121816216   26492364          0   24726180   75655352
-/+ buffers/cache:   21434684  126873896
Swap:      2097144          0    2097144&lt;/PRE&gt;</description>
    <pubDate>Fri, 16 Oct 2015 04:48:02 GMT</pubDate>
    <dc:creator>wgonzalez</dc:creator>
    <dc:date>2015-10-16T04:48:02Z</dc:date>
    <item>
      <title>Seeing PSYoungGen GC on YARN nodes aggressively every 8-10 secs</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Seeing-PSYoungGen-GC-on-YARN-nodes-aggressively-every-8-10/m-p/95526#M8857</link>
      <description>&lt;P&gt;Seeing PSYoungGen GC on YARN/RM nodes aggressively every 8-10 secs: &lt;/P&gt;&lt;PRE&gt;6171.971: [GC [PSYoungGen: 136875K-&amp;gt;512K(144896K)] 252995K-&amp;gt;116647K(1542656K), 0.0070630 
secs] [Times: user=0.05 sys=0.00, real=0.01 secs] 
6180.179: [GC [PSYoungGen: 115578K-&amp;gt;480K(142336K)] 231714K-&amp;gt;116703K(1540096K), 0.0065030 
secs] [Times: user=0.04 sys=0.01, real=0.00 secs] 
6190.316: [GC [PSYoungGen: 141974K-&amp;gt;736K(142336K)] 258198K-&amp;gt;116975K(1540096K), 0.0059270 
secs] [Times: user=0.03 sys=0.01, real=0.01 secs] 
6200.145: [GC [PSYoungGen: 137784K-&amp;gt;672K(140800K)] 254024K-&amp;gt;116935K(1538560K), 0.0059910 
secs] [Times: user=0.04 sys=0.00, real=0.01 secs] 
6209.983: [GC [PSYoungGen: 138483K-&amp;gt;704K(139776K)] 254746K-&amp;gt;117079K(1537536K), 0.0060980 
secs] [Times: user=0.04 sys=0.00, real=0.01 secs] 
6220.037: [GC [PSYoungGen: 137973K-&amp;gt;544K(138752K)] 254349K-&amp;gt;117157K(1536512K), 0.0036380 
secs] [Times: user=0.02 sys=0.00, real=0.00 secs] 
6228.739: [GC [PSYoungGen: 117752K-&amp;gt;768K(137728K)] 234366K-&amp;gt;117437K(1535488K), 0.0061930 
secs] [Times: user=0.03 sys=0.02, real=0.00 secs] 
6237.006: [GC [PSYoungGen: 113053K-&amp;gt;768K(136704K)] 229723K-&amp;gt;117493K(1534464K), 0.0037210 
secs] [Times: user=0.03 sys=0.00, real=0.01 secs] 
6244.956: [GC [PSYoungGen: 107217K-&amp;gt;672K(135680K)] 223942K-&amp;gt;117589K(1533440K), 0.0080520 
secs] [Times: user=0.03 sys=0.03, real=0.01 secs] 
6255.009: [GC [PSYoungGen: 133601K-&amp;gt;512K(134656K)] 250518K-&amp;gt;117517K(1532416K), 0.0070160 
secs] [Times: user=0.04 sys=0.00, real=0.01 secs] 
6265.007: [GC [PSYoungGen: 131612K-&amp;gt;672K(134144K)] 248618K-&amp;gt;117821K(1531904K), 0.0039770 
secs] [Times: user=0.04 sys=0.00, real=0.00 secs] 
6274.880: [GC [PSYoungGen: 129679K-&amp;gt;815K(132608K)] 246828K-&amp;gt;118132K(1530368K), 0.0036740 
secs] [Times: user=0.03 sys=0.00, real=0.01 secs] 
6284.782: [GC [PSYoungGen: 129427K-&amp;gt;608K(132096K)] 246744K-&amp;gt;118164K(1529856K), 0.0039850 
secs] [Times: user=0.04 sys=0.00, real=0.01 secs] 
6294.809: [GC [PSYoungGen: 129803K-&amp;gt;576K(130560K)] 247360K-&amp;gt;118260K(1528320K), 0.0037790 
secs] [Times: user=0.03 sys=0.00, real=0.00 secs] 
6297.736: [GC [PSYoungGen: 39607K-&amp;gt;544K(129536K)] 157291K-&amp;gt;118316K(1527296K), 0.0035240 secs] 
[Times: user=0.02 sys=0.00, real=0.00 secs] &lt;/PRE&gt;&lt;P&gt;Is this normal?&lt;/P&gt;&lt;P&gt;Is there a way to tune it properly?&lt;/P&gt;&lt;P&gt;It looks like the new generation size is set to 144 MB; should this be increased?&lt;/P&gt;&lt;P&gt;Thank you!&lt;/P&gt;</description>
      <pubDate>Fri, 16 Oct 2015 02:49:58 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Seeing-PSYoungGen-GC-on-YARN-nodes-aggressively-every-8-10/m-p/95526#M8857</guid>
      <dc:creator>wgonzalez</dc:creator>
      <dc:date>2015-10-16T02:49:58Z</dc:date>
    </item>
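    <!--
      If the new generation did need to grow, here is a minimal sketch of how that
      could be done, assuming a Hadoop 2.x layout where the ResourceManager JVM
      options live in yarn-env.sh (the exact file and variable can differ by
      distribution, and on an Ambari-managed cluster this would be set through
      the Ambari configuration screens instead):

        # Append HotSpot flags to the ResourceManager's JVM options.
        # -XX:NewSize / -XX:MaxNewSize bound the young generation explicitly;
        # 256m is an illustrative value, not a recommendation.
        export YARN_RESOURCEMANAGER_OPTS="$YARN_RESOURCEMANAGER_OPTS \
          -XX:NewSize=256m -XX:MaxNewSize=256m"

      Equivalently, -Xmn256m fixes the young generation at the same size. As the
      replies below note, the pauses here are already tiny, so this tuning is
      optional at best.
    -->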
    <item>
      <title>Re: Seeing PSYoungGen GC on YARN nodes aggressively every 8-10 secs</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Seeing-PSYoungGen-GC-on-YARN-nodes-aggressively-every-8-10/m-p/95527#M8858</link>
      <description>&lt;P&gt;Are you running any long-running, I/O-intensive, or memory-intensive jobs? &lt;/P&gt;&lt;PRE&gt;Please post the output of:

free -g&lt;/PRE&gt;</description>
      <pubDate>Fri, 16 Oct 2015 03:10:33 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Seeing-PSYoungGen-GC-on-YARN-nodes-aggressively-every-8-10/m-p/95527#M8858</guid>
      <dc:creator>nsabharwal</dc:creator>
      <dc:date>2015-10-16T03:10:33Z</dc:date>
    </item>
    <item>
      <title>Re: Seeing PSYoungGen GC on YARN nodes aggressively every 8-10 secs</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Seeing-PSYoungGen-GC-on-YARN-nodes-aggressively-every-8-10/m-p/95528#M8859</link>
      <description>&lt;P&gt;Yes, but it varies heavily depending on the day and time. At times it can be intensive. I am not sure whether it is heavy at this moment, but let me get that output.&lt;/P&gt;</description>
      <pubDate>Fri, 16 Oct 2015 03:14:35 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Seeing-PSYoungGen-GC-on-YARN-nodes-aggressively-every-8-10/m-p/95528#M8859</guid>
      <dc:creator>wgonzalez</dc:creator>
      <dc:date>2015-10-16T03:14:35Z</dc:date>
    </item>
    <item>
      <title>Re: Seeing PSYoungGen GC on YARN nodes aggressively every 8-10 secs</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Seeing-PSYoungGen-GC-on-YARN-nodes-aggressively-every-8-10/m-p/95529#M8860</link>
      <description>&lt;PRE&gt;[root@hadm0301 ~]# free -c 5
             total       used       free     shared    buffers     cached
Mem:     148308580  121816216   26492364          0   24726180   75655352
-/+ buffers/cache:   21434684  126873896
Swap:      2097144          0    2097144&lt;/PRE&gt;</description>
      <pubDate>Fri, 16 Oct 2015 04:48:02 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Seeing-PSYoungGen-GC-on-YARN-nodes-aggressively-every-8-10/m-p/95529#M8860</guid>
      <dc:creator>wgonzalez</dc:creator>
      <dc:date>2015-10-16T04:48:02Z</dc:date>
    </item>
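    <!--
      A quick sanity check on reading that free output (values in KB): the memory
      actually available to applications is the "free" column plus buffers and
      cache, which the kernel can reclaim on demand. Using the numbers above:

        # free + buffers + cached = the "-/+ buffers/cache" free figure
        $ echo $(( 26492364 + 24726180 + 75655352 ))
        126873896    # KB, i.e. roughly 121 GB available out of ~141 GB total

      So the node is not under memory pressure, and swap is untouched.
    -->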
    <item>
      <title>Re: Seeing PSYoungGen GC on YARN nodes aggressively every 8-10 secs</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Seeing-PSYoungGen-GC-on-YARN-nodes-aggressively-every-8-10/m-p/95530#M8861</link>
      <description>&lt;P&gt;@&lt;A href="http://community.hortonworks.com/users/228/wgonzalez.html"&gt;William Gonzalez&lt;/A&gt;&lt;/P&gt;&lt;P&gt;What's the output of free -g during job execution? Do you see warnings during idle time? &lt;/P&gt;&lt;P&gt;Were you able to see any error messages?&lt;/P&gt;</description>
      <pubDate>Mon, 19 Oct 2015 19:12:16 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Seeing-PSYoungGen-GC-on-YARN-nodes-aggressively-every-8-10/m-p/95530#M8861</guid>
      <dc:creator>nsabharwal</dc:creator>
      <dc:date>2015-10-19T19:12:16Z</dc:date>
    </item>
    <item>
      <title>Re: Seeing PSYoungGen GC on YARN nodes aggressively every 8-10 secs</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Seeing-PSYoungGen-GC-on-YARN-nodes-aggressively-every-8-10/m-p/95531#M8862</link>
      <description>&lt;P&gt;Thanks for the replies, all. This was a non-issue. &lt;/P&gt;&lt;P&gt;Ambari is reading JMX, and JMX is incorrectly aggregating GC times. &lt;/P&gt;&lt;P&gt;Similar to: &lt;A href="https://issues.apache.org/jira/browse/AMBARI-3178" target="_blank"&gt;https://issues.apache.org/jira/browse/AMBARI-3178&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Thanks for the help!&lt;/P&gt;</description>
      <pubDate>Tue, 20 Oct 2015 01:06:58 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Seeing-PSYoungGen-GC-on-YARN-nodes-aggressively-every-8-10/m-p/95531#M8862</guid>
      <dc:creator>wgonzalez</dc:creator>
      <dc:date>2015-10-20T01:06:58Z</dc:date>
    </item>
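    <!--
      One way to cross-check Ambari's numbers against the raw metrics is to query
      the daemon's JMX servlet directly (the hostname, and the RM web port of
      8088, are illustrative assumptions):

        # PS Scavenge is the young-gen collector under the parallel collector;
        # CollectionCount and CollectionTime (ms) are cumulative since JVM start.
        $ curl 'http://rm-host:8088/jmx?qry=java.lang:type=GarbageCollector,name=PS%20Scavenge'

      If the cumulative CollectionTime grows by only a few ms between samples
      while a dashboard reports heavy GC, the aggregation rather than the JVM is
      the likely suspect.
    -->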
    <item>
      <title>Re: Seeing PSYoungGen GC on YARN nodes aggressively every 8-10 secs</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Seeing-PSYoungGen-GC-on-YARN-nodes-aggressively-every-8-10/m-p/95532#M8863</link>
      <description>&lt;P&gt;IMO, this doesn't look bad at all. While you could tune the young generation size a bit higher to make these collections less frequent, the amount of time spent in GC is quite low, so it's unlikely to have any impact on long-term performance. We'd also need to see entries for the Perm Gen and Old Gen to determine what impact increasing the young gen would have.&lt;/P&gt;&lt;P&gt;Let's break it down:&lt;/P&gt;&lt;P&gt;This is a Young Generation collection, also known as a minor collection. The total heap capacity of the young generation hovers around 135 MB, which aligns with your setting. The young gen usage before each GC hovers around 130 MB (sometimes less, as the heap needs of new objects determine when a GC is triggered). After GC, the young gen holds less than 1 MB, meaning cleanup went very well and most objects in this application are short-lived.&lt;/P&gt;&lt;P&gt;Ultimately, these collections took about 5 ms each (0.005 seconds) every 8 seconds or so (8,000 ms), roughly 0.06% of the application's total run time, which is perfectly fine.&lt;/P&gt;</description>
      <pubDate>Tue, 20 Oct 2015 01:24:44 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Seeing-PSYoungGen-GC-on-YARN-nodes-aggressively-every-8-10/m-p/95532#M8863</guid>
      <dc:creator>skumpf</dc:creator>
      <dc:date>2015-10-20T01:24:44Z</dc:date>
    </item>
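    <!--
      The overhead figure in the reply above can be verified from the log itself:
      with an average pause of about 0.005 s every ~8 s,

        0.005 s / 8 s = 0.000625, i.e. about 0.06% of wall-clock time in minor GC

      which supports the conclusion that no tuning is required.
    -->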
  </channel>
</rss>