ConsumeKafka to PutHiveQL can't send data directly in NiFi


HDP-2.6.3.0, NiFi 1.6.0

How do I insert data into a Hive table from a Kafka consumer?

ConsumeKafka -----> PutHiveQL

1 ACCEPTED SOLUTION

Master Guru
@Hariprasanth Madhavan

The PutHiveQL processor is used to:

Execute a HiveQL DDL/DML command (e.g., UPDATE, INSERT). The content of an incoming FlowFile is expected to be the HiveQL command to execute.
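
For illustration, the content of a FlowFile routed to PutHiveQL would be a single statement like the one below (the table name, columns, and values are hypothetical, only meant to show what the FlowFile content should look like):

-- Hypothetical FlowFile content for PutHiveQL: one HiveQL DML statement.
-- kafka_events and its columns are made-up names for illustration.
INSERT INTO TABLE kafka_events
VALUES ('evt-001', '2018-04-30 12:00:00', 'sample payload');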

-

If you want to insert data into a Hive table directly, then use the PutHiveStreaming processor instead of PutHiveQL.

The PutHiveStreaming processor expects the incoming data in Avro format, and the target table needs to have transactions enabled. Depending on the format of the data coming from ConsumeKafka, use a ConvertRecord processor to convert the source data into Avro format, then feed the Avro data into PutHiveStreaming.
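
As a minimal sketch, a table that PutHiveStreaming can write to could be created like this (the table name, columns, and bucket count are assumptions; the point is that Hive streaming needs an ORC-stored, bucketed table with transactions enabled):

-- Minimal sketch of a transactional Hive table for PutHiveStreaming.
-- Table/column names and bucket count are hypothetical.
-- Hive streaming requires ORC storage, bucketing, and ACID enabled on the
-- Hive side (e.g. hive.support.concurrency=true and
-- hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager).
CREATE TABLE kafka_events (
  event_id   STRING,
  event_time TIMESTAMP,
  payload    STRING
)
CLUSTERED BY (event_id) INTO 4 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional' = 'true');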

Flow:

1. ConsumeKafka
2. ConvertRecord // convert the outgoing FlowFile into Avro format
3. PutHiveStreaming

Refer to this link for Hive transactional tables and this link for ConvertRecord processor usage.

-

If the answer helped to resolve your issue, click on the Accept button below to accept the answer. That would be a great help to community users looking for a quick solution to these kinds of issues.
