Created 11-06-2023 06:10 AM
I imported 100 GB+ of Parquet data into a Hue table with a manually defined schema. I had defined 14 of the 46 columns, and the import succeeded. My next step was to define all 46 columns. When I imported the data again, I got an error:
java.io.IOException: org.apache.hadoop.hive.ql.metadata.HiveException: Cannot inspect org.apache.hadoop.io.ArrayWritable
The Parquet files no longer appear in the HDFS filesystem. Is there a way to find out what happened to the data? What went wrong? Can it be corrected?
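For anyone hitting something similar: Hive's LOAD DATA INPATH moves the source files into the table's storage directory rather than copying them, so files vanishing from their original HDFS path does not by itself mean the data is gone. A minimal diagnostic sketch, assuming a hypothetical table name my_table (the table name and warehouse path below are placeholders, not taken from this post):

```shell
# Ask the Hive metastore where the table's data actually lives
# ("my_table" is a placeholder for the real table name)
hive -e "DESCRIBE FORMATTED my_table;" | grep -i location

# List whatever files Hive is managing at that location
# (substitute the Location value printed by the command above)
hdfs dfs -ls /user/hive/warehouse/my_table
```

If the files are present there, the data was relocated rather than lost; if the failed import wrote corrupt or incompatible files, they should also be visible at that path.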
Created 11-06-2023 12:33 PM
@Jarran Welcome to the Cloudera Community!
To help you get the best possible solution, I have tagged our Hue experts, @dturnau and @jrm, who may be able to assist you further.
Please keep us updated on your post, and we hope you find a satisfactory solution to your query.
Regards,
Diana Torres

Created 11-09-2023 03:22 PM
@Shmoo @mszurap By any chance do you have any insights here? Thanks!
Regards,
Diana Torres