Created 07-24-2018 01:14 PM
I got the above error while selecting records from a Hive table.
Created 07-29-2018 04:50 AM
Hey @Sundar Lakshmanan!
Could you share with us the following output?
describe formatted <table_name>;
Usually, this happens due to mismatched types or data overflowing the datatype.
Are you able to do a simple query?
select * from <table_name> limit 10;
Hope this helps!
Created 12-14-2019 05:21 AM
No, I can't view the result of this simple query.
Created 05-30-2025 04:58 AM
Instead of select *, you can select the columns you got from the describe statement individually and identify the column with the issue (the type can be helpful there); see the sketch below.
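For example, a minimal sketch, assuming col_a and col_b are hypothetical column names taken from the DESCRIBE FORMATTED output:
select col_a from <table_name> limit 10;
select col_b from <table_name> limit 10;
-- whichever query throws the ClassCastException points at the problematic column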
Created 06-02-2025 06:55 AM
It does look like the query failed with a ClassCastException.
The message (org.apache.hadoop.hive.serde2.io.HiveDecimalWritable cannot be cast to org.apache.hadoop.io.LongWritable) indicates a mismatch between the data type Hive expects and the data type it actually encounters while processing the query.
From the error trace, Hive reads a value as DECIMAL (HiveDecimalWritable), but the table metadata declares the column as a long type (LongWritable).
One possible reason is a schema mismatch:
the Hive table schema defines a column as BIGINT, but the underlying data files (e.g., Parquet, ORC) actually contain DECIMAL values for that column.
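For illustration only (hypothetical table and column names), this is the kind of mismatch that can produce the cast error on read:
CREATE TABLE sales (amount BIGINT) STORED AS ORC;
-- if the ORC files under the table location were written with amount as
-- DECIMAL(10,2), selecting that column can fail with
-- HiveDecimalWritable cannot be cast to LongWritable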
To validate:
Run DESCRIBE FORMATTED <your_table_name>; for the table involved in the failing query.
Pay close attention to the data types of all columns, especially those that might be involved in the conversion.
Compare these Hive schema data types with the actual data types in your source data files. For example,
if you're using Parquet, use tools like parquet-tools to inspect the schema of the Parquet files.
if you're using ORC, use hive --orcfiledump to inspect the schema of the ORC files (see the example commands below).
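For example (hypothetical file paths; point these at actual data files under your table's HDFS location):
parquet-tools schema /warehouse/mydb.db/mytable/part-00000.parquet
hive --orcfiledump /warehouse/mydb.db/mytable/000000_0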
Also make sure the SerDe in the table definition matches the actual underlying file format.
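For instance, for an ORC table, the DESCRIBE FORMATTED output should show something like this (abridged; the values will differ for other formats):
SerDe Library: org.apache.hadoop.hive.ql.io.orc.OrcSerde
InputFormat: org.apache.hadoop.hive.ql.io.orc.OrcInputFormat
OutputFormat: org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat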