<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Java heap space in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Java-heap-space/m-p/103101#M15395</link>
    <description>&lt;P&gt;I have configured HDFS &amp;amp; Hive on Fedora 20 (32-bit Linux) in a VM. HDFS and Hive run properly on the VM, but a java.lang.OutOfMemoryError: Java heap space occurs when connecting through an external tool or program such as Jasper or a Java JDBC client.&lt;/P&gt;</description>
    <pubDate>Fri, 16 Sep 2022 09:58:08 GMT</pubDate>
    <dc:creator>kr_ratan</dc:creator>
    <dc:date>2022-09-16T09:58:08Z</dc:date>
    <item>
      <title>Java heap space</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Java-heap-space/m-p/103101#M15395</link>
      <description>&lt;P&gt;I have configured HDFS &amp;amp; Hive on Fedora 20 (32-bit Linux) in a VM. HDFS and Hive are running properly on the VM, but a problem occurs when trying to connect through any external tool or program such as Jasper or a Java client.&lt;/P&gt;&lt;P&gt;Please find the error below for reference:&lt;/P&gt;&lt;P&gt;Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:353)
at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:215)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
at org.apache.hadoop.hive.service.ThriftHive$Client.recv_execute(ThriftHive.java:116)
at org.apache.hadoop.hive.service.ThriftHive$Client.execute(ThriftHive.java:103)
at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:192)
at org.apache.hadoop.hive.jdbc.HiveStatement.execute(HiveStatement.java:132)
at org.apache.hadoop.hive.jdbc.HiveConnection.configureConnection(HiveConnection.java:133)
at org.apache.hadoop.hive.jdbc.HiveConnection.&amp;lt;init&amp;gt;(HiveConnection.java:122)
at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:106)
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at Hive_test.main(Hive_test.java:20)&lt;/P&gt;&lt;P&gt;I have included the following JARs:&lt;/P&gt;&lt;P&gt;ant-1.8.1.jar
apache-httpcomponents-httpclient.jar
apache-httpcomponents-httpcore.jar
commons-el-1.0-javadoc.jar
commons-el-1.0-sources.jar
commons-el-1.0.jar
commons-logging-1.1.1.jar
commons-logging-api-1.0.4.jar
connector-sdk-1.99.5.jar
hadoop-common-2.6.0.2.2.4.2-2.jar
hive-beeline-0.12.0.jar
hive-cli-0.12.0.jar
hive-common-0.12.0.jar
hive-contrib-0.12.0.jar
hive-exec-0.12.0.jar
hive-exec-0.8.0.jar
hive-exec-0.8.0.jar.zip
hive-io-exp-core-0.6.jar.zip
hive-io-exp-deps-0.6-sources.jar.zip
hive-io-exp-deps-0.6.jar.zip
hive-jdbc-0.12.0.jar
hive-metastore-0.12.0.jar
hive-service-0.12.0.jar
httpclient-4.2.5.jar
httpcore-4.2.4.jar
jackson-core-asl-1.8.8.jar
jasper-compiler-5.5.23.jar
jasper-runtime-5.5.23.jar
jsp-api-2.1-sources.jar
jsp-api-2.1.jar
junit-4.11-javadoc.jar
junit-4.11-sources.jar
junit-4.11.jar
libfb303-0.9.0.jar
libthrift-0.9.0.jar
log4j-1.2.16-javadoc.jar
log4j-1.2.16-sources.jar
log4j-1.2.16.jar
ql.jar
slf4j-api-1.6.1-javadoc.jar
slf4j-api-1.6.1-sources.jar
slf4j-api-1.6.1.jar
slf4j-log4j12-1.6.1.jar
sqoop-connector-kite-1.99.5.jar&lt;/P&gt;&lt;P&gt;Please suggest a corrective action.&lt;/P&gt;</description>
      <pubDate>Fri, 16 Sep 2022 09:58:08 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Java-heap-space/m-p/103101#M15395</guid>
      <dc:creator>kr_ratan</dc:creator>
      <dc:date>2022-09-16T09:58:08Z</dc:date>
    </item>
    <item>
      <title>Re: Java heap space</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Java-heap-space/m-p/103102#M15396</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/2237/krratan.html" nodeid="2237"&gt;@Kumar Ratan&lt;/A&gt;&lt;/P&gt;&lt;P&gt;System is running out of memory&lt;/P&gt;&lt;P&gt;Hive is trying to create tez container and system does not have enough Memory&lt;/P&gt;&lt;P&gt;Check the vm memory and see if you can increase it&lt;/P&gt;</description>
      <pubDate>Tue, 19 Jan 2016 12:04:51 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Java-heap-space/m-p/103102#M15396</guid>
      <dc:creator>nsabharwal</dc:creator>
      <dc:date>2016-01-19T12:04:51Z</dc:date>
    </item>
    <item>
      <title>Re: Java heap space</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Java-heap-space/m-p/103103#M15397</link>
      <description>&lt;P&gt;Could you post some of your heap configurations? How much memory is available on the machine? An OOM error usually means the heap configuration is not correct or there is not enough memory available on the machine.&lt;/P&gt;&lt;P&gt;You might also want to check the open files limit (ulimit -a); if it is too low, it can cause OOM errors. (See &lt;A target="_blank" href="https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.2/bk_installing_manually_book/content/ref-729d1fb0-6d1b-459f-a18a-b5eba4540ab5.1.html"&gt;https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.2/bk_installing_manually_book/content/ref-729d1fb0-6d1b-459f-a18a-b5eba4540ab5.1.html&lt;/A&gt;.)&lt;/P&gt;&lt;P&gt;Even though you might be able to run Hadoop on a 32-bit system, I wouldn't recommend it. You should use a 64-bit system (see &lt;A target="_blank" href="http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.2/bk_installing_manually_book/content/meet-min-system-requirements.html"&gt;http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.2/bk_installing_manually_book/content/meet-min-system-requirements.html&lt;/A&gt;).&lt;/P&gt;</description>
      <pubDate>Tue, 19 Jan 2016 14:03:59 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Java-heap-space/m-p/103103#M15397</guid>
      <dc:creator>jstraub</dc:creator>
      <dc:date>2016-01-19T14:03:59Z</dc:date>
    </item>
    <item>
      <title>Re: Java heap space</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Java-heap-space/m-p/103104#M15398</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/2237/krratan.html" nodeid="2237"&gt;@Kumar Ratan&lt;/A&gt; are you still having issues with this? Can you accept best answer or provide your workaround?&lt;/P&gt;</description>
      <pubDate>Wed, 03 Feb 2016 09:43:14 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Java-heap-space/m-p/103104#M15398</guid>
      <dc:creator>aervits</dc:creator>
      <dc:date>2016-02-03T09:43:14Z</dc:date>
    </item>
  </channel>
</rss>

