Created 01-04-2018 06:07 AM
Hello,
I am trying to connect to the HDFS of a remote server (a different company) using the PutHDFS processor in NiFi.
I already connect to my organisation's Hadoop using core-site.xml and hdfs-site.xml, and I want to transfer some files from my org's Hadoop to the other org's Hadoop.
They have shared their core-site.xml and hdfs-site.xml, but I am not able to connect using them.
So I just wanted to know: what is the right way to make this connection?
Thanks,
Mahendra
Created on 01-04-2018 06:12 AM - edited 08-18-2019 02:16 AM
You can put the core-site.xml and hdfs-site.xml in any path on your NiFi host. Then specify the comma-separated paths in the "Hadoop Configuration Resources" property of PutHDFS. You will need to make sure that the NiFi process has at least "read" access to these files.
Example:
/PATH/TO/core-site.xml,/PATH/TO/hdfs-site.xml
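The steps above can be sketched as a few shell commands. This is a minimal sketch assuming a hypothetical directory `/tmp/hdfs-client-conf` on the NiFi host; substitute the directory you actually use, and copy in the real XML files shared with you instead of the empty placeholders created here.

```shell
# Hypothetical location for the client config files (an assumption for illustration).
CONF_DIR=/tmp/hdfs-client-conf
mkdir -p "$CONF_DIR"

# Placeholder files; in practice, copy the core-site.xml / hdfs-site.xml
# shared by the remote Hadoop team into this directory.
: > "$CONF_DIR/core-site.xml"
: > "$CONF_DIR/hdfs-site.xml"

# NiFi only needs read access to these files.
chmod 644 "$CONF_DIR/core-site.xml" "$CONF_DIR/hdfs-site.xml"

# This comma-separated value is what goes into the
# "Hadoop Configuration Resources" property of PutHDFS:
echo "$CONF_DIR/core-site.xml,$CONF_DIR/hdfs-site.xml"
```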
Created 01-04-2018 06:20 AM
Thanks for the reply @Jay Kumar SenSharma.
I am able to connect to the local HDFS this way, but I am facing an issue while connecting to the remote server's HDFS.
Just wanted to know: can we directly use the XML files shared by the Hadoop team, or do we need to make any modifications to them?
Created 01-04-2018 06:23 AM
We can directly use the core-site.xml / hdfs-site.xml files shared by other teams in NiFi.
However, if the core-site.xml / hdfs-site.xml files come from a secured cluster (for example, Kerberos-enabled), then we will need to make sure that NiFi has a valid Kerberos ticket while interacting with the secure HDFS.
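For the Kerberos case, a minimal sketch of the relevant settings. The paths, the realm `EXAMPLE.COM`, and the principal `nifi@EXAMPLE.COM` are hypothetical placeholders for illustration, not values from this thread:

```properties
# nifi.properties: point NiFi at the cluster's Kerberos configuration file
nifi.kerberos.krb5.file=/etc/krb5.conf

# On the PutHDFS processor itself, set the processor properties:
#   Kerberos Principal = nifi@EXAMPLE.COM
#   Kerberos Keytab    = /etc/security/keytabs/nifi.keytab
```

With these in place, the processor obtains its own ticket from the keytab, so no manual kinit is needed on the NiFi host.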
If you are getting any error while connecting to HDFS from NiFi, then it would be good to see the complete error trace. Can you please share the failure logs?
Created 01-08-2018 07:49 AM
Good to know that you are now able to connect to the remote server's HDFS using the provided hdfs-site.xml and related files.
However, your latest query is a bit of an extension, and it would be great if you could ask it as a new HCC thread. It helps us maintain HCC well when one thread has a dedicated query with a dedicated answer; multiple issues in a single thread can sometimes make it harder for HCC users to quickly find the answer.
As the issue originally mentioned in this HCC thread is resolved, it would also be great if you could mark this thread as Answered by clicking the "Accept" button on the correct answer. That way other HCC users can quickly find the solution when they encounter the same issue.
Created 01-08-2018 07:21 AM
Now I am able to connect to the remote server's HDFS using the provided hdfs-site.xml, core-site.xml, krb5.conf, keytab file, and principal.
Thanks for your guidance.
But I am facing one more issue: I have to connect to two Hadoop servers (GetHDFS from one server and PutHDFS to another). In this case, how can I configure two different krb5.conf files from the two Hadoop servers in nifi.properties?
I have configured nifi.properties like below to connect to one server:
nifi.kerberos.krb5.file=C:/Platform_2.0/NiFi/krb5.conf
Is there any way to configure two krb5.conf files in nifi.properties, or can we merge the two files?
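Since `nifi.kerberos.krb5.file` accepts a single file, a common approach is to merge the realm definitions from both clusters' krb5.conf files into one. A hedged sketch, where both realm names (`EXAMPLE-A.COM`, `EXAMPLE-B.COM`) and the KDC hostnames are placeholders standing in for the values found in the two files you were given:

```
[libdefaults]
  default_realm = EXAMPLE-A.COM

[realms]
  EXAMPLE-A.COM = {
    kdc = kdc-a.example-a.com
    admin_server = kdc-a.example-a.com
  }
  EXAMPLE-B.COM = {
    kdc = kdc-b.example-b.com
    admin_server = kdc-b.example-b.com
  }

[domain_realm]
  .example-a.com = EXAMPLE-A.COM
  .example-b.com = EXAMPLE-B.COM
```

Each GetHDFS/PutHDFS processor can then use its own principal and keytab for its respective cluster, while both resolve against this one merged file.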