Hi,
I currently have data sitting in an HDFS location at, say, /location. The data is partitioned by YEAR/MONTH/DAY, so the subfolder structure looks like YEAR=2017/MONTH=8/DAY=2. I am attempting to create an external table on top of this data, but when I do, the partitioned data is not being recognized. The two sets of commands I've tried are:
drop table if exists db.table;
create external table db.table like parquet '/location/file.parquet'
partitioned by (YEAR int, MONTH int, DAY int)
stored as parquet location '/location';
alter table db.table recover partitions;
compute incremental stats db.table;
And...
drop table if exists db.table;
create external table db.table(
field1 string,
field2 string,
...
) partitioned by (YEAR int, MONTH int, DAY int) stored as parquet location '/location/';
alter table db.table recover partitions;
compute incremental stats db.table;
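For reference, what Impala actually registered after either attempt can be inspected with standard statements along these lines (db.table is just the placeholder name used above):
show create table db.table;   -- confirm the column list and the partitioned by clause that got registered
show partitions db.table;     -- list the partitions picked up by recover partitions, with file/row counts
describe formatted db.table;  -- confirm the table's location points at /location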
In both cases, I end up with an empty table that is correctly partitioned. Running invalidate metadata; after the fact did not resolve the issue, and I've verified that the impala user is included in the ACLs (facls) for these paths. Does anyone know why it would not be finding the data?
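Impala can also list the data files it sees for the table, which would show whether the parquet files under the partition directories are visible at all; a sketch using the same placeholder names:
show files in db.table;                                         -- every data file Impala sees for the table
show files in db.table partition (YEAR=2017, MONTH=8, DAY=2);   -- narrow to one day's partition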
I should point out that if I ignore partitioning and instead just build a table on top of one day's data (i.e. /location/YEAR=2017/MONTH=8/DAY=2), the data shows up.
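For completeness, the manual alternative to recover partitions would be to register each partition's directory explicitly; a sketch with the example day above (not practical to do for every day, but it would show whether the data appears once a partition is mapped to its directory):
alter table db.table add if not exists partition (YEAR=2017, MONTH=8, DAY=2)
  location '/location/YEAR=2017/MONTH=8/DAY=2';   -- point the partition at its existing directory
refresh db.table;                                 -- pick up the files in the newly added partition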