Support Questions
Find answers, ask questions, and share your expertise

cannot insert data into hive table with oom error

Contributor
When I run this script in Hive View:

insert overwrite table t2
partition (year='2016')
select *
from t1 e
where e.tradedate >= '2016-01-01' and e.tradedate <= '2016-12-31'

I got this error:
2016-11-01 04:33:05,064 [FATAL] [LeaseRenewer:root@hdfs1.wdp:8020] |yarn.YarnUncaughtExceptionHandler|: Thread Thread[LeaseRenewer:root@hdfs1.wdp:8020,5,main] threw an Error.  Shutting down now...
java.lang.OutOfMemoryError: Java heap space
2016-11-01 04:33:05,064 [WARN] [ResponseProcessor for block BP-1445878840-192.168.0.58-1469702512253:blk_1081464217_7726135] |hdfs.DFSClient|: DFSOutputStream ResponseProcessor exception  for block BP-1445878840-192.168.0.58-1469702512253:blk_1081464217_7726135
java.io.EOFException: Premature EOF: no length prefix available
	at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:2293)
	at org.apache.hadoop.hdfs.protocol.datatransfer.PipelineAck.readFields(PipelineAck.java:244)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer$ResponseProcessor.run(DFSOutputStream.java:748)
2016-11-01 04:33:05,064 [FATAL] [Socket Reader #1 for port 39908] |yarn.YarnUncaughtExceptionHandler|: Thread Thread[Socket Reader #1 for port 39908,5,main] threw an Error.  Shutting down now...
java.lang.OutOfMemoryError: Java heap space
	at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcRequestHeaderProto$Builder.create(RpcHeaderProtos.java:2173)
	at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcRequestHeaderProto$Builder.access$2300(RpcHeaderProtos.java:2141)
	at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcRequestHeaderProto.newBuilder(RpcHeaderProtos.java:2121)
	at org.apache.hadoop.ipc.Server$Connection.processOneRpc(Server.java:1892)
	at org.apache.hadoop.ipc.Server$Connection.readAndProcess(Server.java:1666)
	at org.apache.hadoop.ipc.Server$Listener.doRead(Server.java:880)
	at org.apache.hadoop.ipc.Server$Listener$Reader.doRunLoop(Server.java:746) 
	at org.apache.hadoop.ipc.Server$Listener$Reader.run(Server.java:717)

Any suggestions? Thanks in advance.

6 REPLIES

Re: cannot insert data into hive table with oom error

Super Guru

@boyer

If you are using Tez as the default Hive execution engine, try increasing the Tez container size. If you are using MR, you need to revisit the container memory configuration. Make sure you have enough free memory on all your NodeManagers, and try increasing the container size and heap space (generally about 0.8 × container memory) for the mapper/reducer containers.
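As a sketch of the advice above, these are session-level Hive overrides for the Tez case; the values are illustrative assumptions, not prescriptions, and should be tuned to your cluster:

```sql
-- Illustrative session-level overrides for a Tez-side OOM.
-- Values are examples only; size them to your NodeManager capacity.
SET hive.execution.engine=tez;
SET hive.tez.container.size=2048;    -- Tez task container size in MB
SET hive.tez.java.opts=-Xmx1640m;    -- heap ~0.8 * container size
```

Setting these per session lets you test larger containers for one heavy query without changing the cluster-wide defaults.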


Re: cannot insert data into hive table with oom error

Contributor

Thanks for your help.

I used Tez and set the Tez container size to 4 GB, HiveServer2 heap size to 3.5 GB, Metastore heap size to 3.5 GB, and client heap size to 3.5 GB, but I still got the error.


Re: cannot insert data into hive table with oom error

Expert Contributor

What is your YARN container size set to?


Re: cannot insert data into hive table with oom error

Contributor

Min 512 MB, max 10240 MB.


Re: cannot insert data into hive table with oom error

Expert Contributor

Try changing the container size to a max of 2 GB and a min of 1 GB and see if that helps.
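The suggested bounds would be set in `yarn-site.xml` (or the equivalent Ambari config). This is a sketch with the example values from the reply above; actual values should match your cluster's memory:

```xml
<!-- Illustrative yarn-site.xml fragment: container allocation bounds.
     Values mirror the suggestion above (min 1 GB, max 2 GB). -->
<property>
  <name>yarn.scheduler.minimum-allocation-mb</name>
  <value>1024</value>
</property>
<property>
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>2048</value>
</property>
```

A lower maximum caps how much a single container can request, which can prevent a few oversized containers from starving the node while the per-task heap is tuned separately.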

Re: cannot insert data into hive table with oom error

Expert Contributor

Did it work? Were you able to process the data by increasing the YARN container size?
