Member since: 03-01-2018 · Posts: 38 · Kudos Received: 3 · Solutions: 0
03-16-2018 12:55 PM
@hema moger Yes, you are right. Once you successfully fetch any file from the SFTP server, the data provenance will be updated accordingly.
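If you want to double-check it, here is a rough sketch of querying the provenance repository over the NiFi REST API (this assumes an unsecured NiFi instance at http://localhost:8080; adjust the host/port and add credentials for a secured cluster, and the <query-id> below is a placeholder for the id returned by the first call):

# Submit a provenance query (the response contains the query id and URI)
curl -s -X POST -H 'Content-Type: application/json' \
  -d '{"provenance":{"request":{"maxResults":10}}}' \
  http://localhost:8080/nifi-api/provenance

# Poll the query using the id from the response above
curl -s http://localhost:8080/nifi-api/provenance/<query-id>

You should see RECEIVE/FETCH events for the files pulled from the SFTP server.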
03-08-2018 01:04 PM
@hema moger Great! Could you 'Accept' the answer by clicking the Accept button below? That would be a great help to community users trying to find a solution to this kind of error quickly.
03-01-2018 06:50 PM
@hema moger Great, if it's a Linux server then create a passwordless login between the remote server and the edge node. First, update your /etc/hosts so that the remote server is pingable from your edge node, check the firewall rules, and make sure you don't have a DENY rule. Here is the walkthrough (see attached pic1.jpg). In my case I have a CentOS server GULU and a Cloudera Quickstart VM running in Oracle VM VirtualBox; because they are on the same network it's easy.

GULU (remote server): I want to copy the file test.txt, which is located in /home/sheltong/Downloads

[root@gulu ~]# cd /home/sheltong/Downloads
[root@gulu Downloads]# ls
test.txt

Edge node or localhost:

[root@quickstart home]# scp root@192.168.0.80:/home/sheltong/Downloads/test.txt .
The authenticity of host '192.168.0.80 (192.168.0.80)' can't be established.
RSA key fingerprint is 93:8a:6c:02:9d:1f:e1:b5:0a:05:68:06:3b:7d:a3:d3.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '192.168.0.80' (RSA) to the list of known hosts.
root@192.168.0.80's password: xxxxxremote_server_root_passwordxxx
test.txt                              100%  136   0.1KB/s   00:00

Validate that the file was copied:

[root@quickstart home]# ls
cloudera  test.txt

There you are, I hope that helped.
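By the way, if you want to skip the password prompt entirely (that is the passwordless login mentioned above, useful when NiFi or scripts fetch files unattended), here is a rough sketch run from the edge node, reusing the 192.168.0.80 address from the example and assuming OpenSSH on both machines:

# 1. Generate a key pair on the edge node if you don't have one yet (accept the defaults)
ssh-keygen -t rsa -b 4096

# 2. Copy the public key to the remote server (asks for the password one last time)
ssh-copy-id root@192.168.0.80

# 3. Verify: this should log in without prompting for a password
ssh root@192.168.0.80 'hostname'

# 4. From now on scp/sftp work non-interactively as well
scp root@192.168.0.80:/home/sheltong/Downloads/test.txt .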
03-12-2018 04:17 AM
@Andrew Lim 1) My requirement: we have CSV data on a remote machine, and every hour a large CSV file is generated there, so I used SFTP to fetch it. From there I need to push the data to Kafka topics, so I used PublishKafkaRecord. I have not used the JSON conversion described in the above article, but I am going to use it. 2) Yes, I got the flow from your article. I missed the schema.name attribute and the schema registry because I don't understand what to put in them. Do I need to mention the column names that are in the input file? 3) I have just started learning Kafka. I created partitions in one of the topics, but I have no idea how to load data into specific partitions using NiFi. If you have any suggestion better than this, can you guide me?
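For context, this is roughly how I am checking what ends up in the topic from the command line (the topic name csv_topic and broker localhost:9092 are just placeholders for my setup; older Kafka releases use --zookeeper instead of --bootstrap-server for kafka-topics.sh):

# Describe the topic to see its partition count
kafka-topics.sh --bootstrap-server localhost:9092 --describe --topic csv_topic

# Read what landed in partition 0 (repeat with --partition 1, 2, ...)
kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic csv_topic --partition 0 --from-beginning --max-messages 10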
03-09-2018 08:02 AM
@Andrew Grande I have the same issue, but for the GetSFTP processor: FileNotFoundException. screenshot-from-2018-03-09-13-24-26.png
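To narrow it down, I am verifying the remote path and permissions manually with the same account the GetSFTP processor is configured to use (the user, host, and path below are placeholders for my setup):

# Connect with the same credentials GetSFTP uses and list the configured directory
sftp myuser@remote-host
sftp> ls -l /path/configured/in/GetSFTP

# Or non-interactively, once key-based login is set up
ssh myuser@remote-host 'ls -l /path/configured/in/GetSFTP'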