Created 02-26-2016 06:19 PM
Hi all,
Last week, I set up a two-node HDP 2.3 cluster on EC2 with one master and one slave. The Ambari installation went smoothly, and it deployed and started the Hadoop services.
I prefer to keep the cluster shut down when not in use, for cost efficiency. Because the public IP changes on reboot, ambari-server could not start the Hadoop services this week. Some services do start if I launch them manually in sequence, beginning with HDFS, but nothing starts automatically on reboot.
I believe the server lost connectivity after the public IP address changed, and I am still not able to resolve the problem. I do not think changing the addresses in the config and other files is straightforward; they may be embedded in numerous files/locations.
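As a first sanity check (a minimal sketch only; the hostnames and IPs below are placeholders, not my real cluster names, and on a real node HOSTS_FILE would be /etc/hosts):

```shell
# Sketch: verify that every cluster hostname Ambari knows about still has a
# stable mapping in the hosts file. Placeholder names/IPs; substitute your
# actual FQDNs and point HOSTS_FILE at /etc/hosts on a real node.
HOSTS_FILE=$(mktemp)
printf '10.0.0.11 master.example.internal\n' > "$HOSTS_FILE"

for h in master.example.internal slave.example.internal; do
  if grep -q "$h" "$HOSTS_FILE"; then
    echo "$h: mapped"
  else
    echo "$h: MISSING"
  fi
done
rm -f "$HOSTS_FILE"
```

If a host Ambari registered is missing (or maps to a stale IP), agents can no longer reach the server after a reboot.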
I have got two Elastic IP addresses and assigned them to the two instances. I want to use the Elastic IP's DNS name (example: 3c2-123.157.1.10.compute-1.amazonaws.com) to connect externally, while using the same name to let the servers communicate with each other over the internal EC2 network. I won't be charged for the network traffic as long as my servers are in the same EC2 availability zone. I am aware there is a small charge for any period when the Elastic IPs are not in use, which may come to a few dollars a month. I do not want to use the external Elastic IP address (example: 123.157.1.10) directly for internal server access, as I would then be charged for the network traffic.
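One way to sanity-check this behaviour from inside EC2 (a sketch; the DNS name below is a placeholder for my Elastic IP's public DNS name): Amazon's resolver returns the private IP for a public DNS name when queried from within the region, which is what keeps the traffic on the internal network.

```shell
# Sketch: see what a public EC2 DNS name resolves to from this host.
# Inside the same AWS region this should come back as the *private* IP;
# from outside AWS it will be the public IP (or fail to resolve).
NAME=ec2-123-157-1-10.compute-1.amazonaws.com   # placeholder name

getent hosts "$NAME" || echo "could not resolve $NAME (expected outside AWS)"
```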
Please advise on the best way to resolve this Hadoop services breakdown. Please note that I am also a Linux newbie, so detailed guidance is very much appreciated.
Thanks,
Created 03-13-2016 12:43 AM
I am cleaning them out and will attempt a fresh install. I will close this thread and post a new one if required. Thanks, everyone, for the help.
Created 02-26-2016 06:20 PM
typo: sorry. It is ec2-123.....
Created 02-26-2016 10:27 PM
Something went wrong: I cannot start ambari-server. It is throwing "command not found". A cleanup of the Ambari/Hadoop services and a reinstall looks like the way forward.
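The checks I ran to narrow down the "command not found" (a rough sketch; the paths are the usual defaults for an Ambari RPM install and may differ on other boxes):

```shell
# Sketch: figure out why "ambari-server" is not found.
# Helper: report whether a command is on PATH.
check() {
  command -v "$1" >/dev/null && echo "$1: on PATH" || echo "$1: not found"
}

check ambari-server

# Is the package itself still installed? (CentOS/RHEL)
rpm -q ambari-server 2>/dev/null || echo "ambari-server RPM not installed (or not an RPM system)"

# Does the launcher script still exist at its usual location?
ls -l /usr/sbin/ambari-server 2>/dev/null || echo "/usr/sbin/ambari-server missing"
```

If the RPM is gone but leftover files remain, a partial uninstall has happened and a cleanup/reinstall is indeed the sensible path.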
Created 02-26-2016 11:06 PM
@S Srinivasa Is that a solution?
Created 02-27-2016 12:29 AM
No, that was the most recent problem. Please see my reply below. Thanks.
Created 02-27-2016 12:17 AM
Thanks Neeraj Sabharwal.
I am sorry; I am aware of that. We are somewhere in between: Ambari could not start the Hadoop services initially, and now it is not available at all! I am looking to (a) remove ambari-server/agent, the Hadoop services, etc., and (b) reinstall, mapping the Elastic IP DNS names to address the stop/start problems. Detailed guidance will be greatly appreciated.
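The rough removal plan I have in mind looks like the sketch below. The package names are guesses for a typical HDP-on-CentOS install (they may not match every cluster), and DRY_RUN=1 only prints the commands rather than running them:

```shell
# Sketch of the cleanup, NOT run for real: DRY_RUN=1 echoes each command.
# Package names and paths are assumptions for a typical HDP/CentOS setup.
DRY_RUN=1
run() {
  if [ "$DRY_RUN" = 1 ]; then echo "would run: $*"; else "$@"; fi
}

run sudo ambari-server stop
run sudo ambari-agent stop
run sudo yum erase -y ambari-server ambari-agent
run sudo yum erase -y 'hadoop*' 'zookeeper*'
run sudo rm -rf /etc/hadoop /var/log/hadoop
```

Setting DRY_RUN=0 would execute the commands for real, which is obviously destructive, so the list should be reviewed host by host first.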
Thanks again
SS
Created 02-27-2016 12:21 AM
Please note that I do not want to uninstall Java or MySQL; they are working fine. Thanks.
Created 02-29-2016 11:26 PM
Sorry for not coming back sooner. I tried cleaning out Ambari/HDP before the reinstall. While most of the components have been cleaned out, I can still see packages such as hadoop, accumulo, tez, and spark in the directory.
I followed this post
https://gist.github.com/nsabharwal/f57bb9e607114833df9b
For some of the Hadoop components/services, I am getting "No match for argument: hadoop*". Trying with or without 'sudo' does not work.
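That yum error usually means no installed package matches the pattern. What I plan to try next is listing what is actually installed before erasing (a sketch; the grep patterns are guesses based on the components I still see):

```shell
# Sketch: find the exact names of leftover packages before trying to erase
# them. Both commands fall through to a message if the tool is unavailable.
yum list installed 2>/dev/null | grep -iE 'hadoop|tez|spark|accumulo' \
  || echo "no matching packages via yum (or yum not available)"

# rpm query is another way to see exact installed names:
rpm -qa 2>/dev/null | grep -i hadoop || echo "no hadoop RPMs found"
```

Erasing the exact names printed there, instead of a wildcard, avoids the "No match for argument" error; leftover directories then still need removing by hand.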