I am struggling with an error. I am trying to write a file to HDFS from NiFi, but it fails with the error below. I am running the Hortonworks Sandbox (HDP) and NiFi (HDF), both hosted on an ESXi server.
Caused by: org.apache.hadoop.ipc.RemoteException: File /user/nifi/.temp1.txt could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
1. There is plenty of free space in DFS, so HDFS health looks good. I can run hadoop fs -put <filename> /tmp and write a reasonably large file to HDFS.
2. This property is set to true:

   <property>
     <name>dfs.client.use.datanode.hostname</name>
     <value>true</value>
   </property>
3. hdfs-site.xml and core-site.xml copied from HDP to the HDF (NiFi) machine
4. HDP hostname entry added to /etc/hosts on the NiFi machine
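Since the /etc/hosts entry is the linchpin of the hostname-based setup, here is a quick sanity check I can run from the NiFi machine. This is a minimal sketch; the hostname sandbox-hdp.hortonworks.com and datanode port 50010 (the HDP 2.x default transfer port) are assumptions, so substitute whatever your sandbox actually uses.

```shell
#!/usr/bin/env bash
# Run on the NiFi (HDF) machine.
# Assumed values - replace with your sandbox's hostname/port:
DN_HOST="sandbox-hdp.hortonworks.com"   # hostname from the /etc/hosts entry
DN_PORT=50010                           # default HDP 2.x datanode transfer port

# 1. Does the hostname resolve to the address added to /etc/hosts?
getent hosts "$DN_HOST" || echo "hostname does not resolve on this machine"

# 2. Is the datanode transfer port reachable from here?
if timeout 3 bash -c "exec 3<>/dev/tcp/$DN_HOST/$DN_PORT" 2>/dev/null; then
    echo "datanode port reachable"
else
    echo "datanode port NOT reachable - writes will fail even if the NameNode answers"
fi
```

If step 2 fails while `hadoop fs -put` works on the sandbox itself, that is consistent with the NameNode handing out an address the NiFi machine cannot reach.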
Has anyone encountered a similar problem? I noticed that NiFi is also unable to read data (through GetHDFS). My suspicion is that the NiFi client can connect to the NameNode but not to the DataNode, because the NameNode sends it a private address that is invalid from the NiFi machine (I am using the Hortonworks Sandbox), and that is why the NiFi client cannot read or write data on HDP.
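One way to test that suspicion directly is to ask the NameNode which datanode address it is advertising. This is a hedged sketch assuming the Hadoop client tools and the copied *-site.xml files are available on the machine where it runs; `hdfs dfsadmin -report` prints each datanode as a `Name: <ip>:<port> (<hostname>)` line, which the script below extracts.

```shell
#!/usr/bin/env bash
# Print the address:port the NameNode advertises for each datanode.
# Guarded so the script is a no-op on machines without the hdfs CLI.
if command -v hdfs >/dev/null 2>&1; then
    hdfs dfsadmin -report | awk '/^Name:/ {print $2}'
fi
# Illustrative report line this parses (not real output from my cluster):
#   Name: 172.17.0.2:50010 (sandbox-hdp.hortonworks.com)
# If the printed IP is a private address the NiFi VM cannot route to, the
# client has to fall back to the hostname, i.e.
# dfs.client.use.datanode.hostname must be true in the config the client loads.
```

Note that the property only helps if it is present in the hdfs-site.xml that NiFi's PutHDFS/GetHDFS processors actually load via their Hadoop Configuration Resources property, not just in the config on the sandbox side.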