java.io.IOException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassCastException: org.apache.hadoop.hive.serde2.io.DoubleWritable cannot be cast to org.apache.hadoop.io.IntWritable (state=,code=0)

I am trying to load a Parquet file stored at an HDFS path into an external table stored as Parquet, by pointing the table's location at the directory containing the data file. When I query the table with `SELECT *` through Beeline, it throws the exception above. I have verified the data types of all the columns. `SELECT COUNT(*)` works fine. Please suggest an effective fix.
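The exception suggests a schema mismatch rather than a corrupt file: Hive expected an `IntWritable` (an `INT` column in the table DDL) but the Parquet reader produced a `DoubleWritable` (the file stores that column as `DOUBLE`). `COUNT(*)` can succeed anyway because it does not need to deserialize individual column values. A minimal sketch of the mismatch and fix, using hypothetical table, column, and path names (not from the original post):

```sql
-- First inspect the file's actual schema, e.g. with parquet-tools:
--   parquet-tools schema hdfs:///path/to/data/part-00000.parquet
--
-- Hypothetical mismatch: the file stores `price` as DOUBLE, but the
-- table declares it INT. SELECT * then fails with the
-- DoubleWritable -> IntWritable ClassCastException.
CREATE EXTERNAL TABLE sales (
  id    INT,
  price INT          -- file actually stores DOUBLE here
)
STORED AS PARQUET
LOCATION 'hdfs:///path/to/data';

-- Fix: make the Hive column type match the Parquet file's type.
ALTER TABLE sales CHANGE COLUMN price price DOUBLE;
```

Hive's Parquet SerDe matches columns by name but does not coerce between numeric types, so every column's declared type must match the physical type recorded in the file's footer.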
