Where to find core-site.xml and hdfs-site.xml for the NiFi PutHDFS processor?

Master Collaborator

Hello,

I am trying to connect to HDFS on a remote server (a different company's cluster) using NiFi's PutHDFS processor.

I am already connecting to my own organisation's Hadoop cluster using its core-site.xml and hdfs-site.xml, and now I want to transfer some files from my organisation's cluster to the other organisation's cluster.

They have shared their core-site.xml and hdfs-site.xml, but I am not able to connect using those files.

So I just wanted to know: what is the right way to set up this connection?

Thanks,

Mahendra

1 ACCEPTED SOLUTION

Master Mentor

@Mahendra Hegde

You can put the core-site.xml and hdfs-site.xml at any path on your NiFi host. Then specify the comma-separated paths in the "Hadoop Configuration Resources" property of PutHDFS. You will need to make sure that the NiFi process has at least "read" access to these files.

/PATH/TO/core-site.xml,/PATH/TO/hdfs-site.xml

Example:

[Screenshot: 45718-puthdfs.png — PutHDFS configuration with the Hadoop Configuration Resources property set]
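For instance, a minimal sketch of staging the files on the NiFi host (the /etc/nifi/remote-hdfs-conf directory and the nifi user/group are assumptions; adjust them to your environment):

mkdir -p /etc/nifi/remote-hdfs-conf
cp core-site.xml hdfs-site.xml /etc/nifi/remote-hdfs-conf/
chown -R nifi:nifi /etc/nifi/remote-hdfs-conf
chmod 644 /etc/nifi/remote-hdfs-conf/*.xml

Then, in the PutHDFS processor, set:

Hadoop Configuration Resources: /etc/nifi/remote-hdfs-conf/core-site.xml,/etc/nifi/remote-hdfs-conf/hdfs-site.xml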



5 REPLIES


Master Collaborator

Thanks for the reply @Jay Kumar SenSharma.

I am able to connect to the local HDFS this way, but I am facing an issue while connecting to the remote server's HDFS.

I just wanted to know: can we directly use the XML files shared by their Hadoop team, or do we need to modify them?

Master Mentor

@Mahendra Hegde

We can directly use the core-site.xml / hdfs-site.xml files shared by the other team in NiFi.

However, if the core-site.xml / hdfs-site.xml files are from a secured cluster (for example, Kerberos-enabled), then we will need to make sure that NiFi has a valid Kerberos ticket while interacting with the secure HDFS.
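For illustration, a sketch of the Kerberos-related settings (the principal name, keytab path, and krb5.conf location below are placeholders, and exact property names can vary by NiFi version):

In nifi.properties (points NiFi at the Kerberos configuration):
nifi.kerberos.krb5.file=/etc/krb5.conf

In the PutHDFS processor properties (the identity used to authenticate to the secure HDFS):
Kerberos Principal: nifi@REMOTE.EXAMPLE.COM
Kerberos Keytab: /etc/security/keytabs/nifi.service.keytab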


If you are getting an error while connecting to HDFS from NiFi, it would be good to see the complete error trace. Can you please share the failure logs?

Master Mentor

@Mahendra Hegde

Good to know that you are now able to connect to the remote HDFS using the provided hdfs-site.xml ...etc. files.

However, your latest query goes a bit beyond the original one, and it would be great if you could ask it as part of a new HCC thread. It helps us maintain HCC when each thread has a dedicated query with a dedicated answer; multiple issues in a single thread can sometimes make it harder for HCC users to quickly find the answer.


As the issue originally mentioned in this HCC thread is resolved, it would also be great if you could mark the thread as answered by clicking the "Accept" button on the correct answer. That way, other HCC users can quickly find the solution when they encounter the same issue.

Master Collaborator

@Jay Kumar SenSharma

Now I am able to connect to the remote HDFS using the provided hdfs-site.xml, core-site.xml, krb5.conf, keytab file, and principal.

Thanks for your guidance.

But I am facing one more issue: I have to connect to two Hadoop clusters (GetHDFS from one cluster and PutHDFS to another). In this case, how can I configure two different krb5.conf files for the two clusters in nifi.properties?

I have configured the following in the nifi.properties file to connect to one cluster:

nifi.kerberos.krb5.file=C:/Platform_2.0/NiFi/krb5.conf

Is there any way to configure two krb5.conf files in nifi.properties, or can we merge the two files into one?
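For illustration only, if merging the files is an option, a single krb5.conf can define both realms. The realm names, KDC hosts, and domain mappings below are placeholders, and whether a merged file is sufficient also depends on how the two clusters' Kerberos setups are configured, so treat this as a sketch of the file format rather than a confirmed answer:

[libdefaults]
  default_realm = REALM1.EXAMPLE.COM

[realms]
  REALM1.EXAMPLE.COM = {
    kdc = kdc.realm1.example.com
    admin_server = kdc.realm1.example.com
  }
  REALM2.EXAMPLE.COM = {
    kdc = kdc.realm2.example.com
    admin_server = kdc.realm2.example.com
  }

[domain_realm]
  .realm1.example.com = REALM1.EXAMPLE.COM
  .realm2.example.com = REALM2.EXAMPLE.COM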