06-02-2025 06:55 AM
It does look like the query failed with a ClassCastException: org.apache.hadoop.hive.serde2.io.HiveDecimalWritable cannot be cast to org.apache.hadoop.io.LongWritable. This indicates a mismatch between the data type Hive expects and the data type it actually encounters while processing the query. From the error trace, the data holds a DECIMAL value (HiveDecimalWritable), but the table metadata declares the column as a long (LongWritable).

One possible reason is a schema mismatch: the Hive table schema defines the column as a long type (e.g., BIGINT), but the underlying data files (Parquet, ORC, ...) actually contain DECIMAL values for that column.

To validate:
- Run DESCRIBE FORMATTED <your_table_name>; for the table involved in the failing query. Pay close attention to the data types of all columns, especially those that might be involved in the conversion.
- Compare those Hive schema data types with the actual data types in your source data files. For Parquet, use parquet-tools to inspect the schema of the Parquet files; for ORC, use hive --orcfiledump to inspect the schema of the ORC files.
- Also make sure the SerDe points to a valid underlying file format.

A minimal sketch of those checks follows.
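For reference, here is a rough sketch of the commands above; the table name and file paths are placeholders you would replace with your own, and it assumes the parquet-tools CLI and the Hive CLI are available:

-- In Beeline / the Hive shell: see what column types the metastore declares
DESCRIBE FORMATTED your_table_name;

# From a shell: dump the schema actually stored in one of the Parquet data files
parquet-tools schema /path/to/warehouse/your_table_name/part-00000.parquet

# From a shell: dump the schema and metadata of one of the ORC data files
hive --orcfiledump /path/to/warehouse/your_table_name/000000_0

If the file dump shows the column as decimal while DESCRIBE FORMATTED shows bigint, aligning the table definition with the files, for example with ALTER TABLE your_table_name CHANGE COLUMN col col DECIMAL(p,s); (precision and scale taken from the file schema), is usually what resolves this kind of cast error.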
11-09-2017 09:42 AM
Hi, as usual, it depends on what you need... The Cloudera VM has one node with everything on it, which lets you see how it all works. A quite simple cluster could have 2-3 VMs for CM & the masters and at least 3 VMs for the workers. As I said, and as you can imagine, it depends on what you want to test on it. Believe me, you really need a Cloudera admin to get what you want... In another thread I referred to this blog; I hope this will help you.