I used Hortonworks Data Cloud to set up a 3-node cluster on Amazon. The setup process works great! After finishing the installation I uploaded some files through the Ambari "Files View" and everything seems to work correctly. Then I configured the VPC to be accessible from a remote IP (all traffic, all protocols, all ports). When I try to upload a file remotely I get this error message:
"could only be replicated to 0 nodes instead of minReplication (=1). There are 3 datanode(s) running and 3 node(s) are excluded in this operation."
The file is created but is empty. The file I am trying to send is exactly the same file that I was sending successfully through the Ambari "Files View".
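One common cause of this symptom for remote clients is that the NameNode accepts the file creation but then redirects the actual block write to a DataNode using the node's internal (private EC2) hostname, which the remote client cannot resolve or reach, so every DataNode ends up "excluded" and the file stays empty. A quick way to check is to issue the first step of a WebHDFS CREATE with `curl -i` and inspect the `Location` header of the redirect. As a minimal sketch (the redirect URL below is a hypothetical example, not from your cluster), the host to verify can be extracted like this:

```python
from urllib.parse import urlparse

def datanode_host_from_redirect(location: str) -> str:
    """Return the DataNode host a WebHDFS CREATE redirect points at."""
    return urlparse(location).hostname

# Hypothetical Location header from the NameNode's redirect response to:
#   curl -i -X PUT "http://<namenode>:50070/webhdfs/v1/tmp/f?op=CREATE"
location = "http://ip-10-0-0-12.ec2.internal:50075/webhdfs/v1/tmp/f?op=CREATE"
host = datanode_host_from_redirect(location)
# If this is a private hostname, the remote client must be able to
# resolve it (e.g. via /etc/hosts) for the second PUT to succeed.
print(host)
```

If the extracted host is an internal address that your local machine cannot resolve, that matches the `/etc/hosts` workaround described below.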
I know this is a known issue and believe it is caused by configuration that is not set correctly, but I couldn't find a proper solution.
What I already did, based on information I found around the web:
1. added hostnames to /etc/hosts on my local machine as well as on the nodes
2. added several configuration items under "Custom hdfs-site":
- dfs.client.use.datanode.hostname = true
- dfs.datanode.use.datanode.hostname = true
- dfs.namenode.http-bind-host = 0.0.0.0
- dfs.namenode.https-bind-host = 0.0.0.0
- dfs.namenode.rpc-bind-host = 0.0.0.0
- dfs.namenode.servicerpc-bind-host = 0.0.0.0
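For reference, the Custom hdfs-site entries above end up in the generated hdfs-site.xml roughly like this (a sketch of the resulting config; with Ambari you would set these through the UI rather than editing the file by hand):

```xml
<property>
  <name>dfs.client.use.datanode.hostname</name>
  <value>true</value>
</property>
<property>
  <name>dfs.datanode.use.datanode.hostname</name>
  <value>true</value>
</property>
<property>
  <name>dfs.namenode.rpc-bind-host</name>
  <value>0.0.0.0</value>
</property>
```

Note that dfs.client.use.datanode.hostname also has to be in effect on the client side (i.e. in the hdfs-site.xml your remote client uses), not only on the cluster, for the client to connect to DataNodes by hostname.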
What should I do next? Is there any high-level security I can disable, or any other configuration item I can set to get this working?
Hi there, can you confirm a few things:
- What version of HWCloud are you using (1.8 or 1.11)?
- Which Cluster Type did you choose (with HDP 2.5)?
- For your nodes, is that 1 Master + 3 Workers?