
Hadoop daemons not running. (Ambari installation)

Expert Contributor

I have successfully installed the Ambari server manually, and I have installed most of the services listed in Ambari. However, I do not see the Hadoop daemons running. I don't even find a hadoop directory under the /bin directory. I have the following questions:

1) When the Ambari server is set up on a CentOS machine, where does Ambari install Hadoop?

2) In which folder is Hadoop installed?

3) Why are the Hadoop daemons not started automatically?

4) If Hadoop is not installed, what are the next steps?

Can someone please help me? I cannot find any documentation that helps me understand this.

1 ACCEPTED SOLUTION

Master Mentor
@Pradeep kumar

/usr/hdp/current --> you can see the files there

You have to check the services from the Ambari console
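
For example, on a typical HDP install the entries under /usr/hdp/current are symlinks into the versioned directory (a sketch; the component names and version string depend on what you installed):

# List the component symlinks that Ambari/HDP maintain
ls -l /usr/hdp/current
# Typical entries look like:
#   hadoop-client -> /usr/hdp/2.3.4.0-3485/hadoop
#   hdfs-namenode -> /usr/hdp/2.3.4.0-3485/hadoop-hdfs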


13 REPLIES


Master Mentor

@Pradeep kumar In my case it's under:

[root@sandbox usersync]# ls -l /usr/hdp/
total 12
drwxr-xr-x 39 root root 4096 2015-10-27 13:22 2.3.2.0-2950
drwxr-xr-x 37 root root 4096 2016-02-01 11:49 2.3.4.0-3485
drwxr-xr-x  2 root root 4096 2015-10-27 12:30 current
[root@sandbox usersync]#
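
If more than one version is present like this, hdp-select (where available) reports which version each component currently points at (a sketch; the output format varies by release):

# Show the active version per HDP component
hdp-select status
# Example line: hadoop-client - 2.3.4.0-3485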

Expert Contributor

Hi Naveen.

I do not find the start-dfs.sh file under any folder below /usr/hdp/. I have checked the services in the Ambari dashboard, where I can see that the MapReduce service is working. The configuration section also didn't help me much. My biggest issue is that I don't see Hadoop running on this server: the jps command is not working, and when I looked through the folders I also didn't find the start-dfs.sh file.
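
(For reference, two shell checks that don't depend on jps; a sketch assuming an HDP-style layout under /usr/hdp:)

# Look for the HDFS start scripts anywhere under the HDP tree
find /usr/hdp -name 'start-dfs.sh' 2>/dev/null

# Check for running HDFS daemons via ps instead of jps
ps -ef | grep -i -e namenode -e datanode | grep -v grep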

Master Mentor

@Pradeep kumar Let's focus on Ambari.

That's my Ambari dashboard. You need to open

http://ambariserver:8080

(replace ambariserver with your Ambari host) and then you will have full control over starting and stopping the cluster.

admin/admin is the default login

[screenshot: Ambari dashboard]
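
The same information is available from the command line through Ambari's REST API (a sketch; admin/admin, ambariserver, and MyCluster are placeholders for your own credentials, host, and cluster name):

# List the clusters this Ambari server manages
curl -u admin:admin http://ambariserver:8080/api/v1/clusters

# Show the state of the HDFS service in a cluster
curl -u admin:admin http://ambariserver:8080/api/v1/clusters/MyCluster/services/HDFS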

Expert Contributor

Hi Naveen,

I think you have not understood my question correctly. I am able to access the Ambari dashboard. I can see all the nodes in my cluster, and all of them are working fine. Even the MapReduce service is shown as working (green). The question is: how do I ensure that Hadoop is working fine? How do I do a smoke test to see if Hadoop is working? This is where my problem started. At the command prompt, if I type the jps command, it is not detected as a valid command. So the problem is how to check that the Hadoop daemons are running and the Hadoop system is in place.
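
(For the daemon part of the question, a shell-level check might look like this; a sketch assuming you can become the hdfs user:)

# Confirm HDFS daemons are up and the NameNode is reachable
su - hdfs -c "hdfs dfsadmin -report" | head -n 20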

Master Mentor

@Pradeep kumar Thank you for clarifying.

This will help a lot http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.4/bk_installing_manually_book/content/ch_getti...

Every component has its own smoke test.
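
For HDFS and MapReduce, the quick checks usually look like this (a sketch; the examples jar path is an HDP convention and may differ on your release):

# HDFS smoke test: write a file and read it back as the hdfs user
su - hdfs
hadoop fs -mkdir -p /tmp/smoketest
echo "hello" | hadoop fs -put - /tmp/smoketest/hello.txt
hadoop fs -cat /tmp/smoketest/hello.txt

# MapReduce smoke test: run the bundled pi example
hadoop jar /usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar pi 2 10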

Expert Contributor

Thanks Neeraj. I have gone through the smoke test link you gave, and also some other documentation, but they seem to confuse me further. Maybe I am just not understanding the documents and it is completely my mistake. But can you tell me, in one simple sentence, why I cannot find the jps command and why I cannot find the start-dfs.sh script? Please avoid giving any URLs if possible. Where is my understanding wrong?

Master Mentor

You can configure jps in your OS via the Java config; on RHEL it's the alternatives command. Then, as the hdfs user, you can run jps. start-dfs.sh is not necessary, as that is controlled by Ambari. @Pradeep kumar
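
(A sketch of both fixes; $JAVA_HOME and the priority value 1 are placeholders for your install:)

# Option 1: call jps straight from the JDK
$JAVA_HOME/bin/jps

# Option 2: register jps through alternatives on RHEL/CentOS
alternatives --install /usr/bin/jps jps $JAVA_HOME/bin/jps 1
jps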