Support Questions


casting problem in parquet file format in hive

Rising Star


Earlier I had the datatype for one column as decimal, stored as Parquet. I then changed the datatype to bigint. After this change, I am no longer able to select data from the table; it shows this error message:

Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row [Error getting row data with exception java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hive.serde2.io.HiveDecimalWritable

Please help me with this.

Thanks!
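The underlying issue is that ALTER TABLE only changes the metastore schema; the existing Parquet files still carry the original decimal type, so the reader and the table definition disagree. A common workaround (a sketch only; the table and column names `sales` and `amount` are placeholders, not from this post) is to revert the column type so the old files read again, then rewrite the data into a table with the desired type:

```sql
-- Sketch of a common workaround; sales/amount are placeholder names.

-- 1. Revert the column to its original type so the existing Parquet
--    files can be read again (use the original precision/scale).
ALTER TABLE sales CHANGE amount amount DECIMAL(18,0);

-- 2. Rewrite the data into a new Parquet table with the desired type;
--    the CAST runs while the old files are still readable as decimal.
CREATE TABLE sales_bigint STORED AS PARQUET AS
SELECT CAST(amount AS BIGINT) AS amount
FROM sales;
```

After verifying the new table, it can replace the old one (e.g. via DROP and RENAME), since simply altering the column type in place does not rewrite the files.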

2 REPLIES


Hi @Sundar Lakshmanan!

I think I just answered an old question of yours; not sure if it's related to this one, but:
https://community.hortonworks.com/questions/207791/failed-with-exception-javaioioexceptionorgapacheh...
Could you confirm that this column contains only decimal values? Usually, a cast fails against a mismatched datatype.

Hope this helps! 🙂
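The check suggested above could be done with a query along these lines (a sketch only; `sales` and `amount` are placeholder names): with the column temporarily reverted to its original DECIMAL type, look for values that would not survive a cast to BIGINT.

```sql
-- Sketch of the suggested sanity check; sales/amount are placeholders.
-- Rows where the cast fails or loses a fractional part are suspects.
SELECT amount
FROM sales
WHERE CAST(amount AS BIGINT) IS NULL
   OR amount != CAST(amount AS BIGINT)
LIMIT 10;
```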

New Contributor

I have the same scenario: the column was decimal and was updated to bigint, and now I get an error while querying the column. The data type on the table and in the Parquet file are aligned.

Error: java.io.IOException: java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hive.serde2.io.HiveDecimalWritable (state=,code=0)

 

If you have already resolved the issue, I'd much appreciate it if you could let me know what worked for you.

Thanks,

Dinesh