Support Questions

Java heap space

SOLVED

New Contributor

I have configured HDFS and Hive on Fedora 20 (32-bit Linux) in a VM. HDFS and Hive are running properly on the VM, but a problem occurs when I try to connect through any external tool or program, such as Jasper or Java.

Please find below the error for reference:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:353)
    at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:215)
    at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
    at org.apache.hadoop.hive.service.ThriftHive$Client.recv_execute(ThriftHive.java:116)
    at org.apache.hadoop.hive.service.ThriftHive$Client.execute(ThriftHive.java:103)
    at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:192)
    at org.apache.hadoop.hive.jdbc.HiveStatement.execute(HiveStatement.java:132)
    at org.apache.hadoop.hive.jdbc.HiveConnection.configureConnection(HiveConnection.java:133)
    at org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:122)
    at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:106)
    at java.sql.DriverManager.getConnection(Unknown Source)
    at java.sql.DriverManager.getConnection(Unknown Source)
    at Hive_test.main(Hive_test.java:20)

I have included the following JARs:

ant-1.8.1.jar apache-httpcomponents-httpclient.jar apache-httpcomponents-httpcore.jar commons-el-1.0-javadoc.jar commons-el-1.0-sources.jar commons-el-1.0.jar commons-logging-1.1.1.jar commons-logging-api-1.0.4.jar connector-sdk-1.99.5.jar hadoop-common-2.6.0.2.2.4.2-2.jar hive-beeline-0.12.0.jar hive-cli-0.12.0.jar hive-common-0.12.0.jar hive-contrib-0.12.0.jar hive-exec-0.12.0.jar hive-exec-0.8.0.jar hive-exec-0.8.0.jar.zip hive-io-exp-core-0.6.jar.zip hive-io-exp-deps-0.6-sources.jar.zip hive-io-exp-deps-0.6.jar.zip hive-jdbc-0.12.0.jar hive-metastore-0.12.0.jar hive-service-0.12.0.jar httpclient-4.2.5.jar httpcore-4.2.4.jar jackson-core-asl-1.8.8.jar jasper-compiler-5.5.23.jar jasper-runtime-5.5.23.jar jsp-api-2.1-sources.jar jsp-api-2.1.jar junit-4.11-javadoc.jar junit-4.11-sources.jar junit-4.11.jar libfb303-0.9.0.jar libthrift-0.9.0.jar log4j-1.2.16-javadoc.jar log4j-1.2.16-sources.jar log4j-1.2.16.jar ql.jar slf4j-api-1.6.1-javadoc.jar slf4j-api-1.6.1-sources.jar slf4j-api-1.6.1.jar slf4j-log4j12-1.6.1.jar sqoop-connector-kite-1.99.5.jar

So please suggest any corrective action.
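
For reference, a minimal client along the lines of the stack trace would look roughly like the sketch below. This is an assumption reconstructed from the trace, not the actual Hive_test.java: the driver class and jdbc:hive:// URL match the classes in the error, but the host, port, database, credentials, and query are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Hypothetical reconstruction of a HiveServer1 JDBC client such as Hive_test.
// Host, port, table, and credentials are placeholders, not values from the original post.
public class Hive_test {
    public static void main(String[] args) throws Exception {
        // Legacy HiveServer1 driver, matching org.apache.hadoop.hive.jdbc.* in the stack trace
        Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");

        // The OutOfMemoryError in the trace is thrown from inside this call,
        // while the Thrift layer reads the server's response during connection setup
        Connection con = DriverManager.getConnection("jdbc:hive://localhost:10000/default", "", "");

        Statement stmt = con.createStatement();
        ResultSet rs = stmt.executeQuery("SELECT * FROM sample_table LIMIT 10");
        while (rs.next()) {
            System.out.println(rs.getString(1));
        }
        rs.close();
        stmt.close();
        con.close();
    }
}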

1 ACCEPTED SOLUTION

Re: Java heap space

Could you post some of your heap configurations? How much memory is available on the machine? An OOM error usually means the heap configuration is not correct or there is not enough memory available on the machine.

You also might want to check the open files limit (ulimit -a); if it's too low, it can cause OOM errors. (see https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.2/bk_installing_manually_book/content/ref-729...)

Even though you might be able to run Hadoop on a 32-bit system, I wouldn't recommend it. You should use a 64-bit system (see http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.2/bk_installing_manually_book/content/meet-min...)
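
If it helps with gathering that information, the short sketch below prints the client JVM's heap limits and architecture using only standard JDK calls, so you can confirm what the failing program is actually running with. The class name and the -Xmx value in the trailing comment are just examples, not settings from this thread.

// Diagnostic sketch: confirm the heap the failing client JVM is actually running with.
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024L * 1024L;
        // Maximum heap the JVM may grow to (the effective -Xmx)
        System.out.println("Max heap (MB):   " + rt.maxMemory() / mb);
        // Heap currently reserved and currently free
        System.out.println("Total heap (MB): " + rt.totalMemory() / mb);
        System.out.println("Free heap (MB):  " + rt.freeMemory() / mb);
        // JVM architecture, e.g. "i386" on a 32-bit JVM or "amd64" on 64-bit
        System.out.println("JVM arch:        " + System.getProperty("os.arch"));
    }
}

// If the max heap is small, rerun the client with a larger heap, for example:
//   java -Xmx512m Hive_test
// (512m is only an example value)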

3 REPLIES

Re: Java heap space

@Kumar Ratan

The system is running out of memory.

Hive is trying to create a Tez container and the system does not have enough memory.

Check the VM memory and see if you can increase it.

Re: Java heap space

Mentor

@Kumar Ratan, are you still having issues with this? Can you accept the best answer or provide your workaround?