
PutHDFS error writing to remote HDFS

New Contributor

Hi,

I have two EC2 machines: the first one is an HDF machine and the other one is an HDP machine. I can put data into HBase on the HDP machine, but PutHDFS into the HDP server is not working. Below you will see the error that I'm getting and my configuration.

The error is: Failed to write to HDFS due to org.apache.nifi.processor.exception.ProcessException: IOException thrown from PutHDFS[id=]: File /user/openagro/.197111111 could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
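For reference, one way to check whether this is specific to NiFi or a general HDFS client problem is to try the same write with a plain Hadoop client from the HDF machine. This is only a rough sketch, assuming the Hadoop client jars are on the classpath; the NameNode URI and target path are placeholders, and dfs.client.use.datanode.hostname is just a common workaround when the NameNode hands back the DataNode's private EC2 IP that the client cannot reach.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.nio.charset.StandardCharsets;

public class HdfsWriteTest {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder NameNode URI -- replace with the HDP NameNode's address and RPC port.
        conf.set("fs.defaultFS", "hdfs://hdp-namenode.example.com:8020");
        // Ask the HDFS client to connect to DataNodes by hostname instead of the
        // (private) IP the NameNode reports -- a common workaround when the client
        // sits outside the cluster's private network on EC2.
        conf.set("dfs.client.use.datanode.hostname", "true");

        FileSystem fs = FileSystem.get(conf);
        Path target = new Path("/user/openagro/puthdfs-test.txt"); // placeholder path
        try (FSDataOutputStream out = fs.create(target, true)) {
            out.write("hello from the HDF host".getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("Wrote " + target + " as user " + System.getProperty("user.name"));
    }
}

If this standalone write fails with the same "could only be replicated to 0 nodes" message, the problem is connectivity from the client to the DataNode rather than anything in the PutHDFS processor configuration.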

Attachments: errorlog.png, puthdfsconfiguration.png

Thank you for your help.

3 REPLIES

Super Guru

New Contributor

Hi @Shu,

I formatted the NameNode and started it again, but PutHDFS is still giving the same error. Do you have any idea? Do I need to give permissions to the remote user, or something like that?

Also, I changed my sandbox-hdp address to the public address and now the NameNode is not able to start.
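On the permission question: whether the remote user can write under the target directory can be checked directly with the Hadoop FileSystem API, independently of NiFi. A minimal sketch, assuming the Hadoop client jars are available; the NameNode URI and directory are placeholders. Note that the "could only be replicated to 0 nodes" message normally points at DataNode connectivity rather than permissions; a permission problem would usually surface as an AccessControlException instead.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsPermissionCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://hdp-namenode.example.com:8020"); // placeholder URI

        FileSystem fs = FileSystem.get(conf);
        Path dir = new Path("/user/openagro"); // placeholder target directory
        if (!fs.exists(dir)) {
            System.out.println(dir + " does not exist -- the writing user needs create rights on the parent");
        } else {
            FileStatus status = fs.getFileStatus(dir);
            System.out.println(dir + " owner=" + status.getOwner()
                    + " group=" + status.getGroup()
                    + " permission=" + status.getPermission());
        }
    }
}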

Thanks a lot for your help.

New Contributor

Hi,

I found that port 50010 is not the only one you need to open; the DataNode uses other ports as well, such as 50470. Look through your configuration in Ambari and you will find those ports, or check hdfs-site.xml and core-site.xml.
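To confirm which of those ports are actually reachable from the HDF machine, a quick TCP probe can help. This is just a sketch in plain Java with no Hadoop dependencies; the hostname is a placeholder and the port list only covers common HDP 2.x defaults (NameNode RPC 8020, DataNode data transfer 50010, DataNode IPC 50020, NameNode web UI 50070), so adjust it to whatever Ambari or hdfs-site.xml / core-site.xml report for your cluster.

import java.net.InetSocketAddress;
import java.net.Socket;

public class HdfsPortCheck {
    // Placeholder hostname of the HDP machine.
    private static final String HDP_HOST = "hdp-namenode.example.com";
    // Common HDP 2.x default ports -- replace with the values from your config.
    private static final int[] PORTS = {8020, 50010, 50020, 50070};

    public static void main(String[] args) {
        for (int port : PORTS) {
            try (Socket socket = new Socket()) {
                // 3-second connect timeout per port.
                socket.connect(new InetSocketAddress(HDP_HOST, port), 3000);
                System.out.println("port " + port + " reachable");
            } catch (Exception e) {
                System.out.println("port " + port + " NOT reachable: " + e.getMessage());
            }
        }
    }
}

Running this from the HDF host quickly shows whether a security group or firewall rule is still blocking the DataNode ports, which is exactly the situation that produces the "replicated to 0 nodes ... excluded in this operation" error.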

thanks
