
casting problem in parquet file format in hive


Earlier, the datatype of one column was decimal and the table was stored as Parquet. I then changed the datatype to bigint. After changing it, I am no longer able to select the data from the table; it shows this error message:

Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row [Error getting row data with exception java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hive.serde2.io.HiveDecimalWritable
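
The sequence of statements was roughly like the sketch below (the table and column names here are just placeholders for illustration):

-- Placeholder names; the real table and column differ.
-- The existing Parquet files were written while the column was DECIMAL.
CREATE TABLE sales (amount DECIMAL(10,2)) STORED AS PARQUET;
INSERT INTO sales VALUES (123.45);

-- Changing the column type only updates the table metadata;
-- the Parquet files already on disk are not rewritten.
ALTER TABLE sales CHANGE amount amount BIGINT;

-- Selecting now fails, because the type expected by the reader
-- no longer matches the type stored in the old files.
SELECT amount FROM sales;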

Please help me on this.

Thanks!


Re: casting problem in parquet file format in hive

Hi @Sundar Lakshmanan!

I think I just answered an older question of yours; I'm not sure whether it's related to this one, but:
https://community.hortonworks.com/questions/207791/failed-with-exception-javaioioexceptionorgapacheh...
Could you confirm whether this column contains only decimal values? Usually, a cast fails against a mismatched datatype.
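
If the old files were written while the column was still decimal, one option is to change the column type back and copy the data into a new bigint table with an explicit cast. A rough sketch, with assumed table and column names ("sales" and "amount"):

-- 1. Put the column back to its original type so the old Parquet files are readable again.
ALTER TABLE sales CHANGE amount amount DECIMAL(10,2);

-- 2. Create a new Parquet table with the desired bigint column.
CREATE TABLE sales_bigint (amount BIGINT) STORED AS PARQUET;

-- 3. Copy the data across with an explicit cast.
INSERT INTO TABLE sales_bigint SELECT CAST(amount AS BIGINT) FROM sales;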

Hope this helps! :)
