Hello. I'm trying to do a small training task: I need to connect Hive with Kafka and load data into it. I've created an external table for it, but I don't understand how to load data into this table. The data is stored in .parquet files.
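For reference, a minimal external table over Parquet files looks roughly like this (the table name, columns, and HDFS path below are placeholders, not from my actual setup):

```sql
-- External table over an HDFS directory of Parquet files.
-- Hive reads whatever files sit in LOCATION; for an external table
-- there is no separate "load" step beyond placing files in that path.
CREATE EXTERNAL TABLE IF NOT EXISTS events_ext (
  event_id   BIGINT,
  event_time TIMESTAMP,
  payload    STRING
)
STORED AS PARQUET
LOCATION '/data/events';
```

My understanding is that copying the .parquet files into the table's LOCATION (e.g. with `hdfs dfs -put`) should make them queryable, but I'm not sure how Kafka fits into that flow.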
What you describe is a perfect use case for NiFi, and something I have a lot of recent experience with (Kafka to Hive). I have also recently worked with NiFi 1.10 and the new Parquet readers, which can very easily convert Parquet to CSV and Avro for external tables. Going deeper, NiFi can also convert to ORC and insert into Hive native tables.
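On the Hive side, the last step (moving data from an external table into a native ORC table) can be sketched like this; the table and column names are hypothetical:

```sql
-- Native (managed) table stored as ORC.
CREATE TABLE IF NOT EXISTS events_orc (
  event_id   BIGINT,
  event_time TIMESTAMP,
  payload    STRING
)
STORED AS ORC;

-- Copy everything the external table currently sees into the ORC table.
INSERT INTO TABLE events_orc
SELECT event_id, event_time, payload
FROM events_ext;
```

ORC generally gives better compression and query performance in Hive than raw Parquet/CSV external tables, which is why the conversion is often worth the extra step.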
If you comment with more detail on the tools you want to use, I or other contributors can provide more specific examples.