Created 07-24-2018 12:47 PM
casting problem in parquet file format in hive
Earlier, the datatype for one of my columns was decimal, and the table was stored as Parquet. I then changed the datatype to bigint. After this change, I can no longer select data from the table; it shows this error message:
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row [Error getting row data with exception java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hive.serde2.io.HiveDecimalWritable
Please help me with this.
Thanks!
Created 07-29-2018 05:41 AM
I think I just answered an older question of yours. Not sure if it's related to this one, but:
https://community.hortonworks.com/questions/207791/failed-with-exception-javaioioexceptionorgapacheh...
Could you confirm whether this column contains only decimal values? Usually, a cast fails when the datatypes are mismatched.
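For what it's worth, my understanding of why this happens: changing the column type in the metastore does not rewrite the existing Parquet files or, on a partitioned table, the per-partition schemas, so the reader can end up mixing a LongWritable (bigint) with a HiveDecimalWritable (decimal) and fail with exactly this ClassCastException. A sketch of two possible ways out; the table/column names and the DECIMAL(18,0) precision below are made up for illustration:

```sql
-- Hypothetical table/column names; sketch only.

-- If the table is partitioned: ALTER ... CHANGE without CASCADE updates only
-- the table-level schema, so existing partitions keep the old decimal type.
-- CASCADE propagates the change to all partition schemas as well:
ALTER TABLE my_table CHANGE my_col my_col BIGINT CASCADE;

-- Otherwise, reverting the column to its original decimal type should make
-- the table readable again, after which the data can be rewritten into a
-- new Parquet table with the desired type:
ALTER TABLE my_table CHANGE my_col my_col DECIMAL(18,0);
CREATE TABLE my_table_bigint STORED AS PARQUET AS
SELECT CAST(my_col AS BIGINT) AS my_col
FROM my_table;
```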
Hope this helps! 🙂
Created 11-19-2020 03:18 AM
I have the same scenario: the column was decimal and was updated to bigint, and now I get an error while querying that column. The datatypes on the table and in the Parquet file are aligned.
Error: java.io.IOException: java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hive.serde2.io.HiveDecimalWritable (state=,code=0)
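In case it helps anyone hitting this later: it may be worth double-checking that the physical Parquet schema really matches the metastore, since the two can silently differ after an ALTER. A sketch, with hypothetical names and paths:

```sql
-- Check the column type the metastore reports (hypothetical table name):
DESCRIBE my_table;

-- Then compare against the physical schema of one of the data files,
-- e.g. with the parquet-tools CLI outside of Hive:
--   parquet-tools schema /path/to/part-00000.parquet
-- If the file reports a DECIMAL logical type while the table says bigint
-- (or vice versa), the LongWritable/HiveDecimalWritable cast error is the
-- expected symptom of that mismatch.
```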
If you have already resolved the issue, I would much appreciate it if you could share what worked for you.
Thanks
Dinesh