
Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol message was too large. May be malicious. Use CodedInputStream.setSizeLimit() to increase the size limit.

Rising Star

My Hive query is failing even after setting the property ipc.maximum.data.length to the maximum int value (2147483647), with the following stack trace:

Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol message was too large.  May be malicious.  Use CodedInputStream.setSizeLimit() to increase the size limit.
	at com.google.protobuf.InvalidProtocolBufferException.sizeLimitExceeded(InvalidProtocolBufferException.java:110)
	at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:755)
	at com.google.protobuf.CodedInputStream.readRawByte(CodedInputStream.java:769)
	at com.google.protobuf.CodedInputStream.readRawVarint64(CodedInputStream.java:462)
	at com.google.protobuf.CodedInputStream.readUInt64(CodedInputStream.java:188)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$ColumnStatistics.<init>(OrcProto.java:4330)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$ColumnStatistics.<init>(OrcProto.java:4280)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$ColumnStatistics$1.parsePartialFrom(OrcProto.java:4454)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$ColumnStatistics$1.parsePartialFrom(OrcProto.java:4449)
	at com.google.protobuf.CodedInputStream.readMessage(CodedInputStream.java:309)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$StripeStatistics.<init>(OrcProto.java:12224)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$StripeStatistics.<init>(OrcProto.java:12171)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$StripeStatistics$1.parsePartialFrom(OrcProto.java:12260)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$StripeStatistics$1.parsePartialFrom(OrcProto.java:12255)
	at com.google.protobuf.CodedInputStream.readMessage(CodedInputStream.java:309)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$Metadata.<init>(OrcProto.java:12898)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$Metadata.<init>(OrcProto.java:12845)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$Metadata$1.parsePartialFrom(OrcProto.java:12934)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$Metadata$1.parsePartialFrom(OrcProto.java:12929)
	at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$Metadata.parseFrom(OrcProto.java:13045)
	at org.apache.hadoop.hive.ql.io.orc.ReaderImpl$MetaInfoObjExtractor.<init>(ReaderImpl.java:426)
	at org.apache.hadoop.hive.ql.io.orc.ReaderImpl.<init>(ReaderImpl.java:295)
	at org.apache.hadoop.hive.ql.io.orc.OrcFile.createReader(OrcFile.java:197)
	at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.getRecordReader(OrcInputFormat.java:999)
	at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.<init>(CombineHiveRecordReader.java:65)
	... 16 more
1 ACCEPTED SOLUTION

Super Guru

It looks like your table is in ORC format. Can you please set the properties below and try again?

set orc.compress.size=4096;

set hive.exec.orc.default.stripe.size=268435456;
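As a minimal sketch of applying this workaround (Hive CLI assumed): `orc.compress.size` is the ORC compression chunk/buffer size in bytes, and `hive.exec.orc.default.stripe.size` is the default ORC stripe size in bytes (268435456 = 256 MB). Note these are session-level settings and, to my understanding, only affect ORC files written after they are set:

```sql
-- Session-level workaround (values from the answer above).
-- orc.compress.size: size of each ORC compression buffer, in bytes.
-- hive.exec.orc.default.stripe.size: default ORC stripe size, in bytes (256 MB here).
set orc.compress.size=4096;
set hive.exec.orc.default.stripe.size=268435456;

-- These apply to files written from this point on, so existing ORC data
-- must be rewritten (e.g. with INSERT OVERWRITE) to pick them up.
```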


9 REPLIES


Rising Star

Actually, it's a select query that is causing this issue, so do you want me to set these properties while storing the data in the table?

Super Guru

Yes. Please follow these steps and let me know if you still face the same issue. Also, kindly mention your HDP version.

hive> set orc.compress.size=4096;

hive> set hive.exec.orc.default.stripe.size=268435456;

hive> your create table DDL;

hive> your load-data query into the ORC table;

hive> your select query;
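As a concrete (and purely hypothetical) instantiation of the steps above — table names `events_orc` and `events_staging` and their columns are illustrative, not from the thread — the idea is to set the properties first and then rewrite the data so the new ORC writer settings take effect:

```sql
-- Hypothetical end-to-end session following the steps above.
set orc.compress.size=4096;
set hive.exec.orc.default.stripe.size=268435456;

-- Create the ORC table (illustrative schema).
CREATE TABLE events_orc (
  id BIGINT,
  payload STRING
)
STORED AS ORC;

-- Rewrite the data under the new settings; assumes a staging table
-- events_staging already holds the rows.
INSERT OVERWRITE TABLE events_orc
SELECT id, payload FROM events_staging;

-- The select that previously failed.
SELECT count(*) FROM events_orc;
```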

Rising Star

My HDP version is HDP 2.1.2.0-402.

I will also follow the steps you have given.

New Contributor

@Jitendra Yadav I am facing the same issue in HDP 2.6.5. I followed the steps you mentioned above, but it only works up to a certain data size. I am processing an ORC Hive table with Spark, where a file is stored as one of the columns in this table. Is there a permanent solution for this?

Rising Star

It worked for a few days' worth of data. By the way, can you please explain what these properties mean?

New Contributor

It's a Hive bug; see the Hive JIRA ticket for details:

https://issues.apache.org/jira/browse/HIVE-11592

New Contributor

I have the same issue as well.

My HDP version is 2.3.4.23-3.

Please advise.
