I am new to Cloudera. From Java I am trying to submit a Spark job to CDH 5.5 running in my VirtualBox VM, but I get a "failed to connect to /127.0.0.1:50010 for block" error.

My /etc/hosts has:

127.0.0.1 localhost
192.168.0.143 quickstart.cloudera

I cannot map localhost to 192.168.0.143 because other servers are hardcoded to my localhost. To my client's environment I added the configuration files core-site.xml, hadoop-env.sh, hdfs-site.xml, mapred-site.xml, spark-defaults.xml, spark-env.sh and yarn-site.xml, which I copied from /usr/lib/hadoop and /usr/lib/spark.

Can somebody tell me how to make the Hadoop client look for the datanode at 192.168.0.143 instead of localhost, and whether I need to change any properties in my XML files? For now the only workaround I can think of is running my Java program from Eclipse inside the CDH VM, but that is not what I want. Thanks in advance.
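For reference, this is the kind of client-side override I am experimenting with in my local copy of hdfs-site.xml. The dfs.client.use.datanode.hostname property is my guess at a fix based on the hdfs-default.xml reference, and the 8020 port is the CDH QuickStart default I am assuming; I have not confirmed that this resolves the error:

```xml
<!-- Local hdfs-site.xml on the client machine (my attempted fix, unconfirmed).
     dfs.client.use.datanode.hostname asks the HDFS client to connect to
     datanodes by hostname (quickstart.cloudera, which my /etc/hosts maps to
     192.168.0.143) instead of the IP address the namenode reports back
     (127.0.0.1 in my case). -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://quickstart.cloudera:8020</value>
  </property>
  <property>
    <name>dfs.client.use.datanode.hostname</name>
    <value>true</value>
  </property>
</configuration>
```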
Update: I have now also copied hdfs-default.xml into my client configuration and replaced 0.0.0.0 in it with quickstart.cloudera, but I am still facing the same issue. Thank you.