ConsumeKafka to PutHiveQL can't send data directly in NiFi
Labels: Apache Hive, Apache Kafka, Apache NiFi
Created 08-20-2018 04:41 AM
HDP 2.6.3.0, NiFi 1.6.0
How can I insert data into a Hive table from a Kafka consumer?
ConsumeKafka -----> PutHiveQL
Created 08-20-2018 07:49 PM
The PutHiveQL processor executes a HiveQL DDL/DML command (e.g., UPDATE or INSERT); the content of each incoming FlowFile is expected to be the HiveQL command to execute.
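That is why ConsumeKafka can't feed PutHiveQL directly: Kafka messages carry data, not HiveQL statements. As a minimal sketch (the table name, columns, and values below are hypothetical, not from the original question), a FlowFile routed to PutHiveQL would have to contain something like:

```sql
-- Hypothetical example: the entire content of one FlowFile sent to
-- PutHiveQL is a single HiveQL statement such as this.
INSERT INTO TABLE kafka_events
VALUES ('evt-001', '2018-08-20 04:41:00', 'raw message payload');
```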
If you want to insert data into a Hive table directly, use the PutHiveStreaming processor instead of PutHiveQL.
PutHiveStreaming expects the incoming data in Avro format, and the target table needs to have transactions enabled. So, depending on the format of the data coming out of ConsumeKafka, use a ConvertRecord processor to convert the source data into Avro, then feed the Avro data into PutHiveStreaming. A sketch of a suitable table definition follows the flow below.
Flow:
1. ConsumeKafka
2. ConvertRecord // convert the outgoing FlowFile into Avro format
3. PutHiveStreaming
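To illustrate the "transactions enabled" requirement, the target table would look something like the DDL below. The table name, columns, and bucket count are assumptions for illustration; Hive streaming ingest also requires the table to be bucketed and stored as ORC.

```sql
-- Illustrative DDL for a table that PutHiveStreaming can write to.
-- Streaming ingest needs a bucketed, ORC-backed, transactional table,
-- and ACID support must be enabled on the Hive metastore.
CREATE TABLE kafka_events (
  event_id   STRING,
  event_time TIMESTAMP,
  payload    STRING
)
CLUSTERED BY (event_id) INTO 4 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional' = 'true');
```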
Refer to this link for Hive transactional tables and this link for ConvertRecord processor usage.
If the answer helped resolve your issue, click the Accept button below to accept it; that helps other community users find solutions to these kinds of issues quickly.
