
Oracle to Hive table

Explorer

Hi, I'm new to NiFi.

I'm trying to do a simple data migration from an Oracle database to a Hive table using NiFi 1.4.

I have also read some tutorials from here and other threads on this forum. Any pointers would be appreciated, thanks.

[Screenshots attached: the overall NiFi flow and the ConvertAvroToORC, UpdateAttribute, PutHDFS, and ReplaceText processor configurations]


5 REPLIES

Master Guru

Your PutHDFS processor is placing the data into Hadoop (in ORC format, after ConvertAvroToORC) for use by Hive, so you don't also need to send an INSERT statement to PutHiveQL. Rather, with the pattern you're using, ReplaceText should set the flow file content to either a Hive DDL statement that creates a table on top of the ORC file(s) location, or a LOAD DATA INPATH statement that loads from the HDFS location into an existing Hive table.
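For illustration, a minimal sketch of the two options, using a hypothetical table name and HDFS path (neither appears in the flow above):

    -- Option 1 (DDL): create an external table on top of the directory PutHDFS writes to
    CREATE EXTERNAL TABLE IF NOT EXISTS oracle_migrated (
      id INT,
      name STRING
    )
    STORED AS ORC
    LOCATION '/user/nifi/orc_output';

    -- Option 2: move already-landed ORC files into an existing Hive table
    LOAD DATA INPATH '/user/nifi/staging' INTO TABLE oracle_migrated;

Whichever you choose becomes the flow file content that ReplaceText produces and PutHiveQL executes.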

Explorer

Thanks Matt for the pointer. I have changed the ReplaceText content to "Load data inpath ..". Now I'm hitting an issue where NiFi connects as the anonymous user and gets access denied when moving the file.

[Screenshot attached: permission denied error]
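For anyone else hitting this: the usual fix is to give the connecting user write access to the target HDFS directory, for example (hypothetical path and user, adjust to your cluster):

    hdfs dfs -chown -R anonymous:hadoop /user/nifi/orc_output
    hdfs dfs -chmod -R 775 /user/nifi/orc_output

Note that LOAD DATA INPATH moves the files rather than copying them, so the Hive user needs write permission on the source directory as well.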

Explorer (accepted solution)

I got it fixed by correcting the write permission. Now the ORC file loads into the Hive table, but everything goes into a single column. The source data contains only 8 columns and 14 rows, yet the result in the Hive table ends up in one column, repeating in a loop.

[Screenshot attached: all data inserted into one column]

Master Guru

What does your CREATE TABLE statement look like, and does it match the schema of the file(s) (Avro/ORC) you're sending to the external location?

Explorer

It works fine now. I changed the ReplaceText replacement value to ${hive.ddl} location '${absolute.hdfs.path}'
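For context: ConvertAvroToORC writes a hive.ddl attribute containing a CREATE TABLE statement derived from the incoming Avro schema, and PutHDFS writes an absolute.hdfs.path attribute after a successful put, so the expression above expands to something like the following (table and column names here are hypothetical):

    -- Hypothetical expansion of ${hive.ddl} location '${absolute.hdfs.path}'
    CREATE EXTERNAL TABLE IF NOT EXISTS oracle_table (
      col1 INT,
      col2 STRING
      -- ...remaining columns, as derived from the Avro schema
    ) STORED AS ORC
    location '/user/nifi/orc_output'

Because the DDL is generated from the same schema as the ORC data itself, the table's columns always line up with the files, which resolves the one-column problem above.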