Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

Why am I having problems starting the Ambari Hadoop services?

Expert Contributor

Hi all,

Last week, I set up a two-node HDP 2.3 cluster on EC2 with one master and one slave. The Ambari installation went smoothly, and it deployed and started the Hadoop services.

I prefer to shut the cluster down when not in use to keep costs down. Because the public IP changes on reboot, ambari-server could not start the Hadoop services this week. Some services start if I launch them manually in sequence, starting with HDFS, but the services will not start on reboot.

I believe the server lost connectivity after the public IP address changed, and I am still unable to resolve the problem. I do not think changing the addresses in config and other files is straightforward; they may be embedded in numerous files and locations.
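For reference, one quick way to see which files still embed the old name is to search the usual config locations for it. This is only a sketch: `OLD_HOST` is a placeholder, and the paths are the common HDP 2.x defaults. In particular, the agent's pointer to the Ambari server normally lives in /etc/ambari-agent/conf/ambari-agent.ini (the `hostname=` entry under `[server]`).

```shell
# Hypothetical old public DNS name -- substitute the name the cluster was
# installed under.
OLD_HOST="ec2-54-0-0-1.compute-1.amazonaws.com"

# List config files under the usual Ambari/Hadoop locations that still
# embed the stale name; directories that do not exist are simply skipped.
grep -rl "$OLD_HOST" /etc/ambari-server /etc/ambari-agent /etc/hadoop 2>/dev/null || true
```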

I have two Elastic IP addresses and have assigned them to the two instances. I want to use the Elastic IP's DNS name (example: 3c2-123.157.1.10.compute-1.amazonaws.com) to connect externally, while using the same DNS name to let the servers communicate with each other over the internal EC2 network. I won't be charged for the network traffic as long as my servers are in the same EC2 availability zone. I am aware there is a small charge for any period when the Elastic IPs are not in use, which may come to a few dollars a month. I do not want to use the external Elastic IP address (example: 123.157.1.10) directly for internal server access, as I would be charged for network traffic.
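For what it's worth, that split-horizon behaviour can be checked from a node: inside EC2, Amazon's resolver returns the private address for an instance's public DNS name, so traffic addressed by that name stays on the internal network. A sketch (the DNS name below is a placeholder, not a real host):

```shell
# Placeholder Elastic IP DNS name -- substitute your own.
EIP_DNS="ec2-123-157-1-10.compute-1.amazonaws.com"

# From inside EC2 this should print the private (10.x/172.x) address;
# from outside, the public Elastic IP. Falls back cleanly if unresolvable.
getent hosts "$EIP_DNS" || echo "could not resolve $EIP_DNS"
```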

Please advise on the best way to resolve the Hadoop services breakdown. Please note that I am also a Linux newbie, so detailed guidance is very much appreciated.

Thanks,

1 ACCEPTED SOLUTION

Expert Contributor

I am cleaning them out and will attempt a fresh install. I will close this thread and post a new one if required. Thanks, everyone, for the help.


34 REPLIES

Expert Contributor

Typo, sorry: it should be ec2-123.....

Expert Contributor

Something went wrong: I cannot start ambari-server; it throws "command not found". A clean-up of the Ambari/Hadoop services and a reinstall is the way forward.
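"command not found" usually means the binary is gone from the PATH or the package itself has been removed. A quick check before reinstalling (a sketch; assumes the RHEL/CentOS-style host HDP 2.3 runs on):

```shell
# Is the launcher still on the PATH?
command -v ambari-server || echo "ambari-server not on PATH"

# Is the package still installed? (rpm is standard on CentOS/RHEL.)
rpm -q ambari-server 2>/dev/null || echo "ambari-server package not installed"
```

If the package is still installed but the binary is missing, the install is likely half-broken and a clean removal plus reinstall, as planned, is reasonable.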

Master Mentor

@S Srinivasa Is that a solution?

Expert Contributor

No, that was the most recent problem. Please see my reply below. Thanks.

Master Mentor

Expert Contributor

Thanks Neeraj Sabharwal.

I am sorry; I am aware of that. We are somewhere in between: Ambari could not start the Hadoop services initially, and now it is not available at all! I am looking to (a) remove ambari-server/agent, the Hadoop services, etc., and (b) reinstall, mapping the Elastic IP DNS names so the stop/start problems do not recur. Detailed guidance will be greatly appreciated.
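A rough sketch of step (a) on one node, assuming HDP 2.3 on a CentOS/RHEL host with root access; the package names are the standard Ambari ones, and the list should be adjusted to the actual install:

```shell
# Stop the Ambari processes first; each step degrades to a message if the
# tool is already gone.
ambari-server stop 2>/dev/null || echo "ambari-server already stopped/removed"
ambari-agent stop 2>/dev/null || echo "ambari-agent already stopped/removed"

# Remove the Ambari packages only -- Java and MySQL are left untouched.
yum -y remove ambari-server ambari-agent 2>/dev/null || echo "yum removal skipped"
```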

Thanks again

SS

Expert Contributor

Please note that I do not want to uninstall Java or MySQL; they are working fine. Thanks.

Expert Contributor

Sorry for not coming back sooner. I tried cleaning out Ambari/HDP before the reinstall. While most of the components have been cleaned out, I can still see packages like hadoop, accumulo, tez, and spark in the directory.

I followed this post

https://gist.github.com/nsabharwal/f57bb9e607114833df9b

For some of the Hadoop components/services, I am getting "No match for argument: hadoop*". Trying with or without sudo does not work.
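"No match for argument" from yum generally means no installed package matched the pattern. Two things are worth checking (a sketch): HDP usually bakes the stack version into its package names (e.g. hadoop_2_3_...), so a bare `hadoop*` may genuinely match nothing; and the glob should be quoted so the shell does not expand it against local filenames before yum sees it.

```shell
# See what is actually installed before choosing removal patterns; HDP
# packages usually carry the stack version in the name (hadoop_2_3_...).
rpm -qa 2>/dev/null | grep -i -e hadoop -e tez -e spark || echo "no matching packages"

# Then quote the glob so yum, not the shell, expands it, e.g.:
#   yum remove 'hadoop*' 'hadoop_2_3*'
```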