Created on 12-09-2015 01:28 PM - edited 09-16-2022 02:52 AM
I just upgraded my cluster from CDH 5.3.6 to 5.4.8, and I can no longer access my ORC-formatted tables from Hive; queries fail with the exception below. CDH 5.4.8 states that it ships Hive 1.1.0, although this specific error is reported against Hive 1.2, where it is said to be correctable by upgrading the Kryo library to the latest version.
Has anyone else seen this? Is there a less drastic fix than rolling back my installation?
org.apache.hive.com.esotericsoftware.kryo.KryoException: java.lang.IndexOutOfBoundsException: Index: 109, Size: 75
Serialization trace:
operatorId (org.apache.hadoop.hive.ql.exec.vector.VectorFileSinkOperator)
childOperators (org.apache.hadoop.hive.ql.exec.vector.VectorSelectOperator)
childOperators (org.apache.hadoop.hive.ql.exec.vector.VectorFilterOperator)
childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
    at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
Created 12-09-2015 02:40 PM
This issue was resolved by removing the 5.3.6 libraries from the cluster.
It seems that the Cloudera upgrade leaves the old libraries on the classpath after the new libraries are installed.
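In case it helps anyone hitting the same trace, here is a minimal sketch for locating the leftover jars from the old release before deleting them. The directory names and the version string are assumptions, not something confirmed by the original post; adjust them to wherever your pre-upgrade Hive client libraries actually live.

#!/usr/bin/env python3
# Hypothetical helper: list jars left over from the old CDH release so they
# can be reviewed and removed by hand. Paths below are assumptions.
import os

# Directories that commonly end up on the Hive classpath (assumed, not confirmed).
SEARCH_DIRS = ["/usr/lib/hive/lib", "/opt/cloudera/parcels"]
OLD_VERSION = "5.3.6"

def find_stale_jars(dirs, version):
    """Walk the given directories and collect jar paths mentioning the old version."""
    stale = []
    for root_dir in dirs:
        if not os.path.isdir(root_dir):
            continue
        for dirpath, _dirnames, filenames in os.walk(root_dir):
            for name in filenames:
                full_path = os.path.join(dirpath, name)
                if name.endswith(".jar") and version in full_path:
                    stale.append(full_path)
    return stale

if __name__ == "__main__":
    # Print candidates only; verify each one before actually deleting it.
    for jar in find_stale_jars(SEARCH_DIRS, OLD_VERSION):
        print(jar)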