Unable to write flowfile content to virtual HDFS instance from local NIFI instance


Hi. I am trying to write data to a Hadoop cluster running HDP 2.6.5 from a local NiFi instance, but I receive the following error on write:
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /home/maria_dev/.2d0282a9-8c6a-4bde-beb1-82eb1b061941 could only be replicated to 0 nodes instead of minReplication (=1).  There are 1 datanode(s) running and 1 node(s) are excluded in this operation.

In the HDFS DataNode logs, I see the following warning:

2023-04-18 12:05:13,386 WARN blockmanagement.BlockPlacementPolicy (BlockPlacementPolicyDefault.java:chooseTarget(385)) - Failed to place enough replicas, still in need of 1 to reach 1 (unavailableStorages=[], storagePolicy=BlockStoragePolicy{HOT:7, storageTypes=[DISK], creationFallbacks=[], replicationFallbacks=[ARCHIVE]}, newBlock=true) For more information, please enable DEBUG log level on org.apache.hadoop.hdfs.server.blockmanagement.BlockPlacementPolicy and org.apache.hadoop.net.NetworkTopology

 

I have validated the following in the Hadoop cluster:

The DataNode is up and running, it is not running out of space, and the cluster is not in safe mode.
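
For reference, a minimal sketch of how the safe-mode and free-space checks can be scripted with the HDFS Java client, assuming the Hadoop 2.7 client jars that ship with HDP 2.6.5 and the cluster's core-site.xml/hdfs-site.xml on the classpath (the class name is illustrative):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FsStatus;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.HdfsConstants;

public class HdfsStatusCheck {
    public static void main(String[] args) throws Exception {
        // Assumes the cluster's core-site.xml and hdfs-site.xml are on the classpath
        Configuration conf = new Configuration();
        try (FileSystem fs = FileSystem.get(conf)) {
            DistributedFileSystem dfs = (DistributedFileSystem) fs;

            // Safe-mode state as reported by the NameNode
            boolean safeMode = dfs.setSafeMode(HdfsConstants.SafeModeAction.SAFEMODE_GET);
            System.out.println("NameNode in safe mode: " + safeMode);

            // Aggregate capacity and remaining space across the cluster
            FsStatus status = dfs.getStatus();
            System.out.printf("Capacity: %d bytes, remaining: %d bytes%n",
                    status.getCapacity(), status.getRemaining());
        }
    }
}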

1 ACCEPTED SOLUTION

Master Collaborator

This is not a permission issue at this point; it is more of an issue between the NameNode and the DataNode.

I would request that you start a new thread for HDFS.

Thank you 


7 REPLIES

Community Manager

@iamlazycoder, welcome to our community! To help you get the best possible answer, I have tagged our NiFi experts @MattWho @ckumar @steven-matison @SAMSAL @cotopaul, who may be able to assist you further.

Please feel free to provide any additional information or details about your query, and we hope that you will find a satisfactory solution to your question.



Regards,

Vidya Sargur,
Community Manager



Master Collaborator

Looking at the error snippet, this seems to be an HDFS-level issue, but just to make sure:

I assume you are using the NiFi PutHDFS processor to write into the HDFS cluster, so I would check the following:

  • Check whether the processor is configured with the latest copies of hdfs-site.xml and core-site.xml under Hadoop Configuration Resources.
  • Try to write to the same HDFS location from an HDFS client outside of NiFi, and see whether that works, to isolate whether this is an HDFS issue or a configuration issue on the NiFi processor end (see the sketch after this list).
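
For the second check, a minimal sketch of a standalone write test using the Hadoop FileSystem API; the class name, site-file paths, and target path are illustrative, and it assumes the same core-site.xml and hdfs-site.xml that the PutHDFS processor points at:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import java.nio.charset.StandardCharsets;

public class HdfsWriteTest {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Use the same site files configured in the PutHDFS processor (paths are illustrative)
        conf.addResource(new Path("/path/to/core-site.xml"));
        conf.addResource(new Path("/path/to/hdfs-site.xml"));

        try (FileSystem fs = FileSystem.get(conf);
             FSDataOutputStream out = fs.create(new Path("/home/maria_dev/nifi-write-test.txt"))) {
            // Write a small payload to the same location the flow targets
            out.write("hello hdfs".getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("Write succeeded");
    }
}

If this test fails with the same "could only be replicated to 0 nodes" message, the problem is on the HDFS side rather than in the NiFi processor configuration.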

Thank you 

 

 


@ckumar This exception happens even outside of NiFi; I get the same error when writing to the HDFS location using the HDFS client as well. Could this error be caused by a permission issue on the HDFS directory for the system user that runs my NiFi process or the HDFS client?

Master Collaborator

This is not a permission issue at this point; it is more of an issue between the NameNode and the DataNode.

I would request that you start a new thread for HDFS.
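
A minimal sketch of one way to check how the NameNode currently sees its DataNode(s), which is where this kind of replication failure is usually investigated; it again assumes the Hadoop 2.7 client and the cluster's site files on the classpath, and the class name is illustrative:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.DatanodeInfo;
import org.apache.hadoop.hdfs.protocol.HdfsConstants;

public class DatanodeReport {
    public static void main(String[] args) throws Exception {
        // Assumes the cluster's core-site.xml and hdfs-site.xml are on the classpath
        Configuration conf = new Configuration();
        try (FileSystem fs = FileSystem.get(conf)) {
            DistributedFileSystem dfs = (DistributedFileSystem) fs;
            // How the NameNode currently views each live DataNode
            for (DatanodeInfo dn : dfs.getDataNodeStats(HdfsConstants.DatanodeReportType.LIVE)) {
                System.out.printf("%s (%s) remaining=%d of %d bytes, lastUpdate=%d%n",
                        dn.getHostName(), dn.getXferAddr(),
                        dn.getRemaining(), dn.getCapacity(), dn.getLastUpdate());
            }
        }
    }
}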

Thank you 

avatar

@ckumar Thanks. 

Community Manager

@iamlazycoder Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.  



Regards,

Vidya Sargur,
Community Manager


