Created 03-16-2016 07:57 AM
My Hive query is failing even after setting the property ipc.maximum.data.length to the maximum int value (2147483647), with the following stack trace:
Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol message was too large. May be malicious. Use CodedInputStream.setSizeLimit() to increase the size limit.
	at com.google.protobuf.InvalidProtocolBufferException.sizeLimitExceeded(InvalidProtocolBufferException.java:110)
	at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:755)
	at com.google.protobuf.CodedInputStream.readRawByte(CodedInputStream.java:769)
	at com.google.protobuf.CodedInputStream.readRawVarint64(CodedInputStream.java:462)
	at com.google.protobuf.CodedInputStream.readUInt64(CodedInputStream.java:188)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$ColumnStatistics.<init>(OrcProto.java:4330)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$ColumnStatistics.<init>(OrcProto.java:4280)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$ColumnStatistics$1.parsePartialFrom(OrcProto.java:4454)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$ColumnStatistics$1.parsePartialFrom(OrcProto.java:4449)
	at com.google.protobuf.CodedInputStream.readMessage(CodedInputStream.java:309)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$StripeStatistics.<init>(OrcProto.java:12224)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$StripeStatistics.<init>(OrcProto.java:12171)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$StripeStatistics$1.parsePartialFrom(OrcProto.java:12260)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$StripeStatistics$1.parsePartialFrom(OrcProto.java:12255)
	at com.google.protobuf.CodedInputStream.readMessage(CodedInputStream.java:309)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$Metadata.<init>(OrcProto.java:12898)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$Metadata.<init>(OrcProto.java:12845)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$Metadata$1.parsePartialFrom(OrcProto.java:12934)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$Metadata$1.parsePartialFrom(OrcProto.java:12929)
	at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$Metadata.parseFrom(OrcProto.java:13045)
	at org.apache.hadoop.hive.ql.io.orc.ReaderImpl$MetaInfoObjExtractor.<init>(ReaderImpl.java:426)
	at org.apache.hadoop.hive.ql.io.orc.ReaderImpl.<init>(ReaderImpl.java:295)
	at org.apache.hadoop.hive.ql.io.orc.OrcFile.createReader(OrcFile.java:197)
	at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.getRecordReader(OrcInputFormat.java:999)
	at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.<init>(CombineHiveRecordReader.java:65)
	... 16 more
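For what it's worth, the stack trace shows the failure happening while the ORC reader parses the file's Metadata section (the per-stripe column statistics), which has grown past protobuf's default per-message size limit; as far as I can tell, ipc.maximum.data.length governs Hadoop RPC messages and does not affect this in-process parse. You can inspect the stripe count and statistics of the failing file with the ORC file dump tool (the path below is hypothetical, substitute a file from your table's warehouse location):

hive --orcfiledump /apps/hive/warehouse/mydb.db/mytable/000000_0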
Created 03-16-2016 02:36 PM
It looks like your table is in ORC format, so can you please try setting the properties below and running the query again?
set orc.compress.size=4096
set hive.exec.orc.default.stripe.size=268435456
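Note that session-level `set` commands only affect data written in that session. If you want the settings attached to the table itself, they can (as I understand it) be baked into the DDL via TBLPROPERTIES, using the per-table keys orc.compress.size and orc.stripe.size; the table and columns below are hypothetical:

CREATE TABLE mytable_orc (id INT, payload STRING)
STORED AS ORC
TBLPROPERTIES (
  "orc.compress.size"="4096",       -- compression chunk size in bytes
  "orc.stripe.size"="268435456"     -- 256 MB stripes, so fewer stripes per file
);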
Created 03-16-2016 06:29 PM
Actually, it's a select query that is causing this issue. Do you want me to set these properties while storing the data in the table?
Created 03-17-2016 05:02 AM
Yes. Please follow these steps and let me know if you still face the same issue. Also, kindly mention your HDP version.
hive> set orc.compress.size=4096;
hive> set hive.exec.orc.default.stripe.size=268435456;
hive> your create table DDL;
hive> your load-data query into the ORC table;
hive> your select query;
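A concrete version of the steps above, with hypothetical table names (the key point being that the `set` commands must run before the data is written, since they change how the ORC writer lays out stripes):

set orc.compress.size=4096;
set hive.exec.orc.default.stripe.size=268435456;

CREATE TABLE events_orc (id INT, body STRING) STORED AS ORC;

-- rewrite the data so the new writer settings take effect
INSERT OVERWRITE TABLE events_orc SELECT id, body FROM events_text;

SELECT count(*) FROM events_orc;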
Created 03-17-2016 05:56 PM
My HDP version is 2.1.2.0-402.
I will also follow the steps you suggested.
Created 08-05-2019 12:08 PM
@Jitendra Yadav I am facing the same issue on HDP 2.6.5. I followed the steps you mentioned above, but it only works up to a certain data size. I am processing an ORC Hive table with Spark, where a file is stored as one of the columns in this table. Is there any permanent solution for this?
Created 03-21-2016 01:19 PM
It worked for a few days' worth of data. BTW, can you please elaborate on the meaning of these properties?
Created 03-22-2016 08:27 PM
It's a Hive bug - see the Hive JIRA for details about the bug. As for the properties: orc.compress.size is the size in bytes of each compression chunk the ORC writer uses, and hive.exec.orc.default.stripe.size is the target size in bytes of each stripe (268435456 is 256 MB). Roughly speaking, the ORC file's metadata section stores column statistics for every stripe, so larger stripes mean fewer stripes per file and a smaller metadata section, which helps keep it under protobuf's message size limit.
Created 09-29-2016 08:45 PM
Even I have the same issue.
My HDP version is 2.3.4.23-3.
Please advise.