NiFi use case for ingesting data into Hive

Hello All,

I want to ingest a CSV file into a Hive table, but the CSV file format changes frequently, i.e. I get a different file layout each time. I would like to develop a workflow that can handle any kind of CSV file. Does NiFi provide any such options? If yes, what process should be followed?
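For example, the layout might change between deliveries like this (both files are hypothetical, with made-up column names and values, just to illustrate the variation):

id,name,city
1,John,Austin

id,name,city,state,zip
2,Jane,Denver,CO,80201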

2 REPLIES

@Shu,

Hi Shu, I need your help. I have to ingest CSV data into Hive, and the flow I am using is below:

GetFile --> InferAvroSchema --> ConvertCSVToAvro --> PutHDFS --> ReplaceText --> PutHiveQL

I am getting data as far as PutHDFS, and it is in Avro format. I am using ReplaceText to create the INSERT script, but it is not filling in any of the field values.

The inferred Avro schema fields look like this:

[
  {"name": "field_0", "type": "string", "doc": "Type inferred from 'ID'"},
  {"name": "field_1", "type": "string", "doc": "Type inferred from 'CITY_NAME'"},
  {"name": "field_2", "type": "string", "doc": "Type inferred from 'ZIP_CD'"},
  {"name": "field_3", "type": "string", "doc": "Type inferred from 'STATE_CD'"}
]
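(This is just the fields array; the complete inferred schema wraps it in an Avro record, roughly like the sketch below, where the record name is only a placeholder:)

{
  "type": "record",
  "name": "example_record",
  "fields": [
    {"name": "field_0", "type": "string", "doc": "Type inferred from 'ID'"},
    {"name": "field_1", "type": "string", "doc": "Type inferred from 'CITY_NAME'"},
    {"name": "field_2", "type": "string", "doc": "Type inferred from 'ZIP_CD'"},
    {"name": "field_3", "type": "string", "doc": "Type inferred from 'STATE_CD'"}
  ]
}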

The INSERT template I am using as the ReplaceText replacement value is:

INSERT INTO aaa (field_0, field_1, field_2, field_3) VALUES ('${field_0}', '${field_1}', '${field_2}', '${field_3}')
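(For context, the target table aaa is assumed to be a plain four-column Hive table, with the column types guessed from the quoted values, e.g.:)

CREATE TABLE aaa (
  field_0 STRING,
  field_1 STRING,
  field_2 STRING,
  field_3 STRING
);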

but it is not working. The SQL statement that gets generated is

insert into aaa values (,,,)

so the ${field_0}, ${field_1}, ${field_2} and ${field_3} expressions all seem to evaluate to empty strings, presumably because they refer to FlowFile attributes that are not set (the field values only exist inside the Avro content).
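With working substitution I would expect something like the following instead (sample values made up):

insert into aaa (field_0, field_1, field_2, field_3) values ('101', 'Austin', '73301', 'TX')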

@Matt Burgess,

Hello Matt, I need your help on the above. I have tried multiple things and nothing seems to work. I am using NiFi 1.8.

When I use ConvertJSONToSQL, I get insert into aaa values (?,?,?,?). How do I pass the values here?
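From the PutHiveQL documentation it looks like the ? placeholders are filled from FlowFile attributes named hiveql.args.N.type and hiveql.args.N.value (N being the 1-based position of each parameter), so I assume I would need something like the following attributes on the FlowFile (sample values made up; type 12 is the JDBC code for VARCHAR):

hiveql.args.1.type = 12
hiveql.args.1.value = 101
hiveql.args.2.type = 12
hiveql.args.2.value = Austin
hiveql.args.3.type = 12
hiveql.args.3.value = 73301
hiveql.args.4.type = 12
hiveql.args.4.value = TX

ConvertJSONToSQL seems to write its parameters with the sql.args prefix instead, so presumably those attributes would have to be renamed for PutHiveQL to pick them up.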
