Member since: 08-10-2018 · Posts: 21 · Kudos Received: 0 · Solutions: 0
02-06-2019
04:33 AM
@Siva A
@Geoffrey Shelton Okot
Using `--fields-terminated-by "\020"` solved this issue, and I shared the details as an idea. Find the link below: https://community.hortonworks.com/content/idea/236981/how-to-avoid-null-values-and-extra-columns-during.html
12-12-2018
01:34 AM
Thank you very much for the explanation regarding the workaround to delete the row.
09-28-2018
02:04 PM
1 Kudo
@Hariprasanth Madhavan There are several ways to insert data into a Hive ORC table from NiFi.

Method 1 — PutHiveStreaming processor: Create a transactional table, then feed the Avro data to the PutHiveStreaming processor. The processor converts the Avro-format data into ORC format; to merge all the resulting delta files into one base file, run a major compaction.

Method 2 — ConvertAvroToORC in NiFi, then store in HDFS: Use the ConvertAvroToORC processor to convert the Avro-format data into ORC format, store the data in HDFS, and create an external Hive table pointing to the same HDFS directory.

Method 3 — Create an Avro table and load from it into an ORC table: Based on the Avro file we have in NiFi, we can create Avro tables dynamically from the avro.schema. Create an ORC table, and after storing the Avro data in HDFS, use the PutHiveQL processor to run `INSERT INTO <orc table> SELECT * FROM <avro table>`. Refer to this link for more details on creating Avro tables dynamically.

- If the answer helped to resolve your issue, click the Accept button below to accept it. That helps community users find solutions for these kinds of issues quickly.
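Method 3 above can be sketched in HiveQL. The table names, columns, HDFS paths, and schema URL below are hypothetical placeholders; adjust them to match your Avro data:

```sql
-- Staging table over the Avro files NiFi wrote to HDFS (placeholder paths)
CREATE EXTERNAL TABLE staging_avro
STORED AS AVRO
LOCATION '/data/staging/avro'
TBLPROPERTIES ('avro.schema.url'='/schemas/record.avsc');

-- Target ORC table (placeholder columns)
CREATE TABLE target_orc (id INT, name STRING)
STORED AS ORC;

-- The statement to send through the PutHiveQL processor:
INSERT INTO TABLE target_orc SELECT * FROM staging_avro;
```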
09-18-2018
05:28 AM
@Hariprasanth Madhavan Please mark the answer as accepted if it looks correct to you 🙂
09-05-2018
01:16 PM
Hi, I already mentioned nifi.web.http(s).host=(my server ip) in nifi.properties, but the problem remains the same. I mentioned my IP instead of my FQDN.
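For reference, a minimal nifi.properties fragment with the relevant entries; the host and port values below are placeholders, not values from this thread:

```
# nifi.properties — placeholder host/port values
nifi.web.http.host=192.168.1.10
nifi.web.http.port=8080

# For HTTPS instead, leave the http entries empty and set:
# nifi.web.https.host=192.168.1.10
# nifi.web.https.port=9443
```

NiFi must be restarted after editing nifi.properties for the change to take effect.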
08-25-2018
07:35 AM
@Hariprasanth Madhavan On startup, NiFi launches a bootstrap process that tries to connect to ports available on the host. In your case, it was trying to connect to port 42511 on the first startup and port 34170 on the second, but those ports appear to be blocked, hence the connection-refused errors. Please check your firewall settings.
08-20-2018
05:39 AM
1 Kudo
Please check out the KB articles below: https://community.hortonworks.com/articles/46258/iot-example-in-apache-nifi-consuming-and-producing.html https://community.hortonworks.com/articles/178747/mqtt-with-apache-nifi.html Note: Please upvote and accept this answer if you found it useful.
08-20-2018
07:49 PM
2 Kudos
@Hariprasanth Madhavan The PutHiveQL processor executes a HiveQL DDL/DML command (e.g. UPDATE, INSERT). The content of an incoming FlowFile is expected to be the HiveQL command to execute. - If you want to insert data into a Hive table directly, use the PutHiveStreaming processor instead of PutHiveQL. PutHiveStreaming expects the incoming data in Avro format and the table needs to have transactions enabled, so depending on the format of the data from ConsumeKafka, use a ConvertRecord processor to convert the source data into Avro format, then feed the Avro data into PutHiveStreaming. Flow:
1. ConsumeKafka
2. ConvertRecord // convert the outgoing flowfile into Avro format
3. PutHiveStreaming
Refer to this link for Hive transactional tables and this link for ConvertRecord processor usage. - If the answer helped to resolve your issue, click the Accept button below to accept it. That helps community users find solutions for these kinds of issues quickly.
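For reference, a sketch of a table definition that satisfies the PutHiveStreaming requirements (bucketed, stored as ORC, with transactions enabled); the table name and columns are hypothetical:

```sql
-- Hive streaming targets must be bucketed ORC tables with transactional=true
CREATE TABLE kafka_events (id INT, msg STRING)
CLUSTERED BY (id) INTO 4 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional'='true');
```

Hive-side transaction support (e.g. hive.support.concurrency and the transaction manager settings) must also be enabled for streaming ingest to work.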
10-24-2018
04:06 PM
https://community.hortonworks.com/articles/101679/iot-ingesting-gps-data-from-raspberry-pi-zero-wire.html
08-16-2018
12:30 PM
I wanted this too.