
How to load data of snappy file into hive table?

New Contributor

Hi Guys,

I have a snappy file created in HDFS through the following NiFi flow: QueryDatabaseTable (MySQL) -> PutHDFS (Snappy compression)

I tried to load this snappy file into my Hive table but was unable to do so. The snappy file contains the data of a MySQL table in compressed format.

Can any of you suggest a suitable Hive query to load the data from the snappy file into a Hive table?

1 REPLY

@Parth Karkhanis

Your flow should add an attribute called ${hive.ddl} to the flow file. You should be able to use this DDL statement to create the Hive table. Work with it manually until you get the syntax correct.

In my working example, I route the flow from PutHDFS to a ReplaceText processor, where I append to ${hive.ddl}:

${hive.ddl} LOCATION '/user/nifi/${folderName}/${tableName}/' tblproperties ("orc.compress" = "SNAPPY")

Then I send that to the PutHiveQL processor.
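For illustration, the final statement that reaches PutHiveQL after the ReplaceText append might look something like this (the table name, column names, and folder path here are hypothetical, and this assumes the flow writes ORC files, e.g. via a ConvertAvroToORC step, since the orc.compress property only applies to ORC data):

```sql
-- Hypothetical expansion of the ${hive.ddl} attribute plus the appended clause.
-- An external table is created over the directory PutHDFS wrote to,
-- so Hive reads the Snappy-compressed ORC files in place.
CREATE EXTERNAL TABLE IF NOT EXISTS my_table (id INT, name STRING)
STORED AS ORC
LOCATION '/user/nifi/myFolder/my_table/'
TBLPROPERTIES ("orc.compress" = "SNAPPY");
```

Note that if your PutHDFS is writing raw Snappy-compressed text (not ORC), the ORC table properties won't apply; in that case Hive can usually read Snappy-compressed text files transparently from an external TEXTFILE table, as long as the Snappy codec is on the Hive classpath.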

If this answer helps, please choose ACCEPT.