Support Questions

Find answers, ask questions, and share your expertise

Using SORT BY with externally loaded Parquet files (e.g. Impala LOAD DATA)

Explorer

Hi All,

 

I was looking at this blog: https://blog.cloudera.com/blog/2017/12/faster-performance-for-selective-queries/

where we see that using "SORT BY" during table creation can improve Impala query performance.

As mentioned in the blog, this works only if we use "INSERT" or "CREATE TABLE ... AS SELECT". Our use case is that we create the Parquet file externally, upload it onto HDFS, and then use the Impala "LOAD DATA" command. Is there a way we can use the "SORT BY" mechanism with this model of loading Parquet files?

 

Thanks,

Raju.

6 REPLIES


The external tool that you are using would have to support ordering the data by those columns. E.g. if you're using Hive, it supports SORT BY. If you're writing it from some custom code, that code would need to sort the data before writing it to Parquet.
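As a minimal sketch of the Hive route (table and column names here are hypothetical, standing in for your own schema):

```sql
-- Hypothetical target table stored as Parquet
CREATE TABLE events_sorted (event_id BIGINT, payload STRING)
STORED AS PARQUET;

-- SORT BY orders the rows within each reducer before the Parquet
-- files are written, which tightens the per-row-group min/max stats
-- that Impala uses to skip data on selective queries.
INSERT OVERWRITE TABLE events_sorted
SELECT event_id, payload
FROM events_staging
SORT BY event_id;
```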

Explorer

I am using parquet-cpp to write the Parquet file and then upload it to HDFS using WebHDFS. At the end I use the "LOAD DATA" command to load the Parquet file into Impala.

Is there any option in parquet-cpp to sort it?


I'm not sure that parquet-cpp has any built-in way to sort data - your client code might have to do the sorting before feeding it to parquet-cpp.

Explorer

If I am using dictionary encoding for the column, do I still need to write the data in sorted order in the Parquet file?

I don't think dictionary encoding makes a difference to the effectiveness of min-max stats, because the data is still going to be in the file in the same order regardless.

Explorer

Thanks so much for the help, I will try out sorting and validate query performance.