Hi,
I want to write a Spark DataFrame into a Hive table. The table is partitioned on year and month, and the file format is Parquet.
Currently I am writing the DataFrame into the Hive table using insertInto() with mode("append"). The data does get written, but I am not sure this is the correct way to do it. Also, while writing I get the following exception: "parquet.hadoop.codec.CompressionCodecNotSupportedException: codec not supported: org.apache.hadoop.io.compress.DefaultCodec".
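For what it's worth, here is a sketch of the pattern I understand to be common for this case. The exception usually means Hive's output compression is falling back to Hadoop's DefaultCodec, which Parquet does not support, so the Parquet codec is set explicitly. The table and column names below are placeholders, and this assumes Spark 2.x with Hive support:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("WriteToPartitionedHiveTable")
  // Use a codec Parquet supports (snappy, gzip, or none) instead of
  // letting the write fall back to Hadoop's DefaultCodec.
  .config("spark.sql.parquet.compression.codec", "snappy")
  .enableHiveSupport()
  .getOrCreate()

// Allow writing into partitions chosen by the data itself.
spark.conf.set("hive.exec.dynamic.partition", "true")
spark.conf.set("hive.exec.dynamic.partition.mode", "nonstrict")

// Placeholder DataFrame name; substitute your own.
val df = spark.table("mydb.staging_table")

// For insertInto(), the partition columns (year, month) must be the
// LAST columns of the DataFrame, in the same order as the table's
// partition spec; insertInto matches columns by position, not by name.
df.select("col1", "col2", "year", "month")
  .write
  .mode("append")
  .insertInto("mydb.my_partitioned_table")
```

Note that insertInto() on an existing partitioned table is position-based, so reordering the select to put the partition columns last (matching the table definition) matters; saveAsTable() behaves differently and is generally for tables Spark itself manages.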
Could you please help me with this?
Thanks for your time,