Created 04-30-2018 06:27 AM
Hi all,
I need some help with the PutHiveStreaming processor. I have followed all the prerequisites to set up the data flow into the PutHiveStreaming processor; however, I continue to get the following error:
Hive Streaming connect/write error, flow file will be penalized and routed to retry. org.apache.nifi.util.hive.HiveWriter$ConnectFailure: Failed connecting to EndPoint
I have seen that this is a fairly common error across the NiFi forums, and I gather @Matt Burgess had posted a NAR file to fix this issue. However, that NAR file was for version 1.0 and won't work in my case, as I am using version 1.4 of standalone open-source Apache NiFi (not HDP). Our requirement is to extract data from a database and stream it into a partitioned Hive table. We have used the streaming option to write to HDFS with QueryDatabaseTable -> ConvertAvroToORC -> PutHDFS, but this only lands files on HDFS. We need to insert the data directly into the partitioned table so it is available to end users in real time.
Do you have any idea how to get this processor working, by any chance?
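As a side note for anyone hitting this: Hive Streaming can only write into a table that is stored as ORC, bucketed, and marked transactional. A sketch of a suitable partitioned target table, with hypothetical table and column names:

```sql
-- Hypothetical target table for PutHiveStreaming.
-- Hive Streaming requires ORC storage, bucketing, and ACID
-- (transactional = true) on the table.
CREATE TABLE sales (
  id INT,
  amount DOUBLE
)
PARTITIONED BY (sale_date STRING)
CLUSTERED BY (id) INTO 4 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional' = 'true');
```

If the target table is missing any of these properties, the processor fails to open a streaming connection to the endpoint, which produces a similar ConnectFailure error.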
Created 05-01-2018 02:37 AM
OK, we just got this fixed. It turned out to be an issue with the Thrift server on Hive. All good 🙂
Created 08-17-2018 08:26 PM
Abhinav,
I am seeing the same error. How did you fix this?
Created 09-18-2018 02:42 AM
Hi @Srinatha Anantharaman - we got in touch with our Hadoop admin and opened the firewall to the Thrift metastore server, after which it worked. Hope this helps.
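For anyone debugging the same firewall issue: before digging into the processor configuration, it helps to confirm that the NiFi host can even open a TCP connection to the Hive metastore's Thrift port (9083 by default). A minimal sketch in Python; the hostname below is a placeholder for your own metastore host:

```python
import socket

def can_reach(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical host; 9083 is the default Hive metastore Thrift port.
# print(can_reach("metastore.example.com", 9083))
```

If this returns False from the NiFi host, the problem is network/firewall reachability rather than the processor itself.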