06-02-2025 06:55 AM
It looks like the query failed with a ClassCastException: org.apache.hadoop.hive.serde2.io.HiveDecimalWritable cannot be cast to org.apache.hadoop.io.LongWritable. This indicates a mismatch between the data type Hive expects and the data type it actually encounters while processing the query. From the error trace, Hive is reading a DECIMAL value (HiveDecimalWritable) while the table metadata declares the column as a long/BIGINT (LongWritable).

One possible reason is a schema mismatch: the Hive table schema defines the column as BIGINT, but the underlying data files (Parquet, ORC, etc.) actually contain DECIMAL values for that column.

To validate, run DESCRIBE FORMATTED <your_table_name>; for the table involved in the failing query, and pay close attention to the data types of all columns, especially those involved in the conversion. Then compare the Hive schema data types with the actual data types in your source files: if you're using Parquet, inspect the file schema with a tool like parquet-tools; if you're using ORC, use hive --orcfiledump. Also make sure the table's SerDe matches the actual underlying file format. A sketch of these checks follows below.
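For reference, a minimal sketch of those checks, assuming a Hive-on-HDFS setup; the database, table, and file paths below are hypothetical examples, not taken from the original question:

    # Check the column types Hive has in its metastore (table name is hypothetical)
    beeline -e "DESCRIBE FORMATTED sales_db.transactions;"

    # Inspect the schema actually written inside a Parquet data file (path is hypothetical)
    parquet-tools schema /warehouse/tablespace/managed/hive/sales_db.db/transactions/000000_0

    # Inspect the schema actually written inside an ORC data file (path is hypothetical)
    hive --orcfiledump /warehouse/tablespace/managed/hive/sales_db.db/transactions/000000_0

If the Hive column shows BIGINT but the file-level schema shows a DECIMAL type, you have found the mismatch.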
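If the check confirms that the files really do hold DECIMAL values, one way forward is to realign the Hive column type with the data. This is only a sketch; the table, column, and precision/scale here are assumptions, so adapt them to your schema:

    # Change the declared column type to match the data files
    # (table/column names and DECIMAL(18,2) are hypothetical)
    beeline -e "ALTER TABLE sales_db.transactions CHANGE COLUMN amount amount DECIMAL(18,2);"

Note that ALTER TABLE ... CHANGE COLUMN updates only the metastore definition, not the data files themselves, so it is appropriate exactly when the files are right and the metadata is wrong.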