Parquet schema error
Labels:
- Apache Hadoop
- Apache Hive
- HDFS
Created 07-14-2021 04:56 AM
I had a table that vanished after an issue. I recreated the table and then recovered the partitions so I could read the files that were still in the table's HDFS directory. But now, all of a sudden, it throws the error below:
File 'path/data.0.parq' has an incompatible Parquet schema for column 'db.table.parameter_11'. Column type: STRING, Parquet schema: optional int64 amount [i:10 d:1 r:0]
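For reference, the recreate-and-recover steps looked roughly like this (the table name, columns, and location below are simplified placeholders, not my real DDL):

CREATE EXTERNAL TABLE db.example_table (
  parameter_11 STRING,
  amount BIGINT
)
PARTITIONED BY (dt STRING)
STORED AS PARQUET
LOCATION '/user/hive/warehouse/db.db/example_table';

-- re-register the partition directories that are still present in HDFS
MSCK REPAIR TABLE db.example_table;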
Created 08-23-2021 04:07 PM
Hi,
Can you use beeline, run the command below, and then recreate the table:
set parquet.column.index.access=false;
This should make Hive map the data in your files by the column names instead of by the column positions in your CREATE TABLE statement.
Hope this works for you.
Best Regards
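For example, a beeline session along these lines (table name, columns, and location are placeholders here, adjust them to your real table definition):

-- resolve Parquet columns by name instead of by position
SET parquet.column.index.access=false;

-- dropping an EXTERNAL table leaves the Parquet files in HDFS untouched
DROP TABLE IF EXISTS db.example_table;

CREATE EXTERNAL TABLE db.example_table (
  parameter_11 STRING,
  amount BIGINT
)
PARTITIONED BY (dt STRING)
STORED AS PARQUET
LOCATION '/user/hive/warehouse/db.db/example_table';

-- recover the existing partitions again
MSCK REPAIR TABLE db.example_table;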
