Member since 12-26-2017 · 40 Posts · 9 Kudos Received · 0 Solutions
08-05-2019
12:23 PM
I want to merge incoming source records into existing records in Hive based on a key: if a record already exists in the Hive table, only its values should be updated; otherwise the key and values should both be inserted. Example: I have a Hive table "Test" with record 1 (id: 1, name: abc, age: 20) and record 2 (id: 2, name: xyz, age: 22), and I am receiving the same records from a source (a live sensor). I merge on id, so a matching record only has its values updated, while a new record is inserted. I need to do this from Java code; I already have the query for merging the records:

merge into customer_name
using (select * from all_updates) sub
on sub.id = customer_name.id
when matched then update set name = sub.name, state = sub.state
when not matched then insert values (sub.id, sub.name, sub.state);

I tried this in Java but could not find a way to resolve the issue. Could anybody suggest an approach? Thanks.
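A minimal Java sketch of issuing this MERGE through the Hive JDBC driver. The JDBC URL, credentials, and table names here are assumptions for illustration; at runtime the hive-jdbc driver jar must be on the classpath, and `runMerge` would be called with a real `Connection`:

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

public class HiveMergeUpsert {

    // Build the MERGE statement from the question; target ("customer_name")
    // and staging ("all_updates") table names are passed in as parameters.
    static String buildMergeSql(String target, String staging) {
        return "merge into " + target + "\n"
             + "using (select * from " + staging + ") sub\n"
             + "on sub.id = " + target + ".id\n"
             + "when matched then update set name = sub.name, state = sub.state\n"
             + "when not matched then insert values (sub.id, sub.name, sub.state)";
    }

    // Execute the MERGE over an already-open Hive JDBC connection, e.g. one
    // obtained via DriverManager.getConnection("jdbc:hive2://host:10000/default").
    static void runMerge(Connection conn, String target, String staging) throws SQLException {
        try (Statement stmt = conn.createStatement()) {
            stmt.execute(buildMergeSql(target, staging));
        }
    }

    public static void main(String[] args) {
        // Print the statement so it can be inspected without a live cluster.
        System.out.println(buildMergeSql("customer_name", "all_updates"));
    }
}
```

Note that Hive's MERGE requires the target table to be transactional (stored as ORC with ACID enabled) and Hive 2.2 or later.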
01-15-2019
03:06 PM
@Surendra Shringi Is this a single NiFi instance or part of a NiFi cluster? I suggest taking a look at your nifi.properties and making sure you have configured valid/resolvable hostnames for the following properties:

- nifi.web.http.host or nifi.web.https.host (use the hostname command to verify the hostname of the NiFi server, and try pinging those hostnames from the command line on the NiFi server; were resolution and ping successful?)
- nifi.remote.input.host
- nifi.cluster.node.address (if clustered setup)
- nifi.zookeeper.connect.string (if clustered setup)

Also verify there are no misconfigurations in the /etc/hosts file that would affect resolution of the above configured hostnames.

Thank you, Matt
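As an illustration, the relevant nifi.properties entries might look like the following (the hostnames and ZooKeeper quorum here are hypothetical placeholders, not values from this thread):

```properties
nifi.web.http.host=nifi-node1.example.com
nifi.remote.input.host=nifi-node1.example.com
# Cluster-only settings:
nifi.cluster.node.address=nifi-node1.example.com
nifi.zookeeper.connect.string=zk1.example.com:2181,zk2.example.com:2181,zk3.example.com:2181
```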
09-08-2018
06:51 AM
Hi, thanks for the quick reply @Felix Albani. I tried, but I did not find any way to connect. I am getting the error below. Opened transport
Fatal error: Uncaught exception
'Thrift\Exception\TApplicationException' with message 'Invalid method
name: 'OpenSession'' in
/var/www/html/hs2/hive/org/apache/hive/service/cli/thrift/TCLIService.php:89
Stack trace:
#0
/var/www/html/hs2/hive/org/apache/hive/service/cli/thrift/TCLIService.php(56):
org\apache\hive\service\cli\thrift\TCLIServiceClient->recv_OpenSession()
#1 /var/www/html/hs2/HS2Client.php(40):
org\apache\hive\service\cli\thrift\TCLIServiceClient->OpenSession(Object(org\apache\hive\service\cli\thrift\TOpenSessionReq))
#2 {main}
thrown in /var/www/html/hs2/hive/org/apache/hive/service/cli/thrift/TCLIService.php on line 89

Could you please let me know of any solution?
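For context, this particular TApplicationException often appears when a plain binary Thrift client talks to a SASL-enabled HiveServer2: the SASL negotiation bytes get misread as a method name. One possible workaround, if the environment permits it, is to disable SASL on HiveServer2 in hive-site.xml (the alternative is wrapping the PHP socket in a SASL transport); this is a hedged suggestion, not a confirmed diagnosis of this setup:

```xml
<property>
  <name>hive.server2.authentication</name>
  <value>NOSASL</value>
</property>
```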
09-05-2018
12:55 PM
Thanks for such a nice community. I ran bin/nifi.sh run and now I am able to run NiFi in the background. Thanks @Matt Clarke
08-29-2018
11:17 AM
Hi, I want to build the following flow in NiFi:

Step 1) ConsumeMQTT (using mosquitto)
Step 2) UpdateAttribute
Step 3) EvaluateJsonPath
Step 4) SplitJson (produces a separate flowfile for each element of the JSON array)
Step 5) PutHBaseJSON (stores the JSON in HBase)

The JSON array I am working with looks like this:

{"weather": [
  {"id":"1","latitude":"13.08","longitude":"80.27","temprature":"39","pressure":"1009mb","visibility":"7km","wind":"4mph","cloudcover":"50%","momentmagnitude":"9.1"},
  {"id":"2","latitude":"14.08","longitude":"80.27","temprature":"39","pressure":"1009mb","visibility":"7km","wind":"4mph","cloudcover":"50%","momentmagnitude":"9.1"},
  {"id":"3","latitude":"15.08","longitude":"80.27","temprature":"39","pressure":"1009mb","visibility":"7km","wind":"4mph","cloudcover":"50%","momentmagnitude":"9.1"},
  {"id":"4","latitude":"16.08","longitude":"80.27","temprature":"39","pressure":"1009mb","visibility":"7km","wind":"4mph","cloudcover":"50%","momentmagnitude":"9.1"},
  {"id":"5","latitude":"17.08","longitude":"80.27","temprature":"39","pressure":"1009mb","visibility":"7km","wind":"4mph","cloudcover":"50%","momentmagnitude":"9.1"}
]}

I am trying to insert this into HBase: how do I get each array element into its own HBase row? Any help would be appreciated. Thanks!!
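A sketch of processor settings that could drive the flow described above. The property names come from the stock NiFi SplitJson and PutHBaseJSON processors; the HBase table and column family names are hypothetical placeholders:

```
SplitJson
  JsonPath Expression        : $.weather

PutHBaseJSON
  Table Name                 : weather
  Row Identifier Field Name  : id
  Column Family              : cf
```

With Row Identifier Field Name set to id, each split flowfile (one array element) becomes one HBase row keyed by its id, and the remaining JSON fields become columns in the given family.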
08-17-2018
11:19 AM
Thanks for this great community. Could you suggest a workflow to get data from a GPS device into a Cassandra table using Kafka?
02-02-2018
03:55 PM
Thanks @kgautam. I will try the same and let you know.
01-13-2018
05:51 AM
2 Kudos
Thanks @Shu for such nice help.
01-13-2018
05:03 AM
1 Kudo
Thanks for your overwhelming response; this will help me greatly.
08-29-2018
06:42 AM
Hi @Patrick Maggiulli, I tried the same flow; I am putting data into HBase from HTTP. I have one CSV file that contains the fields (ID, Movie, Type). In the GetFile processor I pick up this file, and the flow remains the same as yours. In UpdateAttribute I set schema.name to "MoviesRecord". But I am getting an error in the ConvertRecord processor: ConvertRecord failed to process StandardFlowFileRecord, "will route to failure: Field field_0 can not be null." Any help would be great. Thanks
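One hedged guess about the error above: a field named field_0 usually means the CSVReader generated positional field names instead of using the names from your schema, for example because "Treat First Line as Header" is not enabled or the Schema Access Strategy is not pointing at the "MoviesRecord" schema. The relevant CSVReader controller-service settings might look like this (the registry name is a placeholder):

```
CSVReader
  Schema Access Strategy      : Use 'Schema Name' Property
  Schema Registry             : AvroSchemaRegistry
  Schema Name                 : ${schema.name}
  Treat First Line as Header  : true
```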
08-29-2018
04:10 AM
Thanks for your quick reply. I tried, but I was unable to figure out the problem. The same error occurs while putting data into HBase.
05-30-2019
01:11 PM
Hi @Andrew Lim, thanks for the detailed explanation. Following your article, I am trying to convert a CSV file to JSON using the ConvertRecord processor and then load the merged JSON (the output of ConvertRecord) into Redshift using COPY from a file. My merged JSON is stored in S3. I am getting an error that the CSV is not in JSON format. Could you please suggest how to load all of these records into Redshift at once?
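For what it's worth, Redshift's COPY command defaults to delimited text, so JSON input has to be declared explicitly. A hedged sketch of such a COPY (the table name, bucket path, and IAM role are hypothetical placeholders, not values from this thread):

```sql
COPY my_table
FROM 's3://my-bucket/merged.json'
IAM_ROLE 'arn:aws:iam::123456789012:role/MyRedshiftRole'
FORMAT AS JSON 'auto';
```

With JSON 'auto', Redshift matches JSON object keys to column names; a JSONPaths file can be supplied instead of 'auto' when the keys do not line up with the columns.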
08-17-2018
11:15 AM
We are using an unsecured Kafka connection, and we are not using MiNiFi; instead we are using MQTT. This is our NiFi flow: MQTT --> PublishKafka --> ConsumeKafka --> PutCassandraQL. What NiFi flow would you suggest for the following process? MQTT --> PublishKafka --> ConsumeKafka --> PutCassandraQL
08-16-2018
12:30 PM
I wanted this too.
03-16-2018
03:15 PM
You can now use the record-based processors.