Support Questions


Hadoop daemons not running. (Ambari installation)

Expert Contributor

I have successfully installed Ambari manually, and I have installed most of the services listed in Ambari. However, I do not see the Hadoop daemons running, and I cannot find a hadoop directory under /bin. I have the following questions:

1) When the Ambari server is set up on a CentOS machine, where does Ambari install Hadoop?

2) Under which folder is Hadoop installed?

3) Why are the Hadoop daemons not started automatically?

4) If Hadoop is not installed, what are the next steps?

Can someone please help me? I cannot find any documentation that explains this.

1 ACCEPTED SOLUTION

Master Mentor
@Pradeep kumar

Look under /usr/hdp/current --> you can see the component files there.

You have to check and start the services from the Ambari console.
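As a quick way to verify this on the node itself, here is a minimal sketch (assuming the standard HDP layout, where Ambari installs components under /usr/hdp/&lt;version&gt; and /usr/hdp/current holds symlinks to the active versions; the helper function name is our own):

```shell
# Sketch: list the HDP components installed on this node.
# Assumption: Ambari installs HDP under /usr/hdp/<version>, with
# /usr/hdp/current holding symlinks to the active component versions.
show_hdp_components() {
  base="${1:-/usr/hdp/current}"
  if [ -d "$base" ]; then
    echo "Hadoop components under $base:"
    ls "$base"
  else
    echo "$base not found - HDP components are not installed on this host"
  fi
}

show_hdp_components
```

If the directory is missing, the install did not complete on that host and the services should be (re)installed from the Ambari console.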


13 REPLIES

Master Mentor

@Pradeep kumar Use jps to list the running Java daemons. To locate the jps binary:

find / -name jps

[root@phdns01 ~]# find / -name jps

/usr/jdk64/jdk1.7.0_67/bin/jps

/usr/jdk64/jdk1.8.0_60/bin/jps
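If jps is not on your PATH, a rough alternative is to look for the daemons in the process list. A hedged sketch (assumption: Hadoop daemons show their class name, e.g. NameNode or DataNode, in the `ps` output; the helper name is our own):

```shell
# Sketch: check for Hadoop daemons in the process list when jps is not
# on the PATH. Assumption: each daemon shows its class name (NameNode,
# DataNode, ...) in the `ps` command column.
is_daemon_running() {
  ps aux | grep -v grep | grep -q "$1"
}

for d in NameNode DataNode SecondaryNameNode; do
  if is_daemon_running "$d"; then
    echo "$d is running"
  else
    echo "$d is NOT running"
  fi
done
```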

Please don't confuse this with start-dfs.sh: that is the "old school" manual method. On an Ambari-managed cluster, the daemons are started and stopped from Ambari.
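On an Ambari-managed cluster you can also start a service from the command line through the Ambari REST API instead of start-dfs.sh. A hedged sketch (assumptions: Ambari server on localhost:8080, default admin/admin credentials, and a cluster named "mycluster" - substitute your own values):

```shell
# Sketch: ask Ambari to start a service via its REST API.
# Assumptions: Ambari server at localhost:8080, default admin/admin
# credentials, cluster name "mycluster" - replace with your own values.
AMBARI_URL="http://localhost:8080/api/v1/clusters/mycluster"

start_service() {
  payload='{"RequestInfo":{"context":"Start '"$1"' via REST"},"Body":{"ServiceInfo":{"state":"STARTED"}}}'
  curl -s -u admin:admin -H 'X-Requested-By: ambari' \
       -X PUT -d "$payload" "$AMBARI_URL/services/$1"
}

# start_service HDFS   # run on a host that can reach the Ambari server
```

The X-Requested-By header is required by Ambari for state-changing requests; the call returns a request object you can poll for progress.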

Smoke Test HDFS

  1. Determine if you can reach the NameNode server with your browser:

    http://$namenode.full.hostname:50070

  2. Create the hdfs user directory in HDFS:
    su - $HDFS_USER
    hdfs dfs -mkdir -p /user/hdfs
  3. Try copying a file into HDFS and listing that file:
    su - $HDFS_USER
    hdfs dfs -copyFromLocal /etc/passwd passwd 
    hdfs dfs -ls
  4. Use the NameNode web UI and the Utilities menu to browse the file system.

Expert Contributor

Thanks everyone. I think I understand it better now :).

Master Mentor

@Pradeep kumar please accept the best answer to close the thread.

Master Mentor

You can install jps manually. You can run 'ps aux' to see the processes, run netstat to see the listening ports, and run fsck to make sure HDFS is healthy. @Pradeep kumar
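To complement 'ps aux', a port check tells you whether a daemon is actually listening. A hedged sketch (assumption: default HDP ports such as 50070 for the NameNode web UI, 8020 for HDFS RPC, and 50075 for the DataNode web UI; the helper name is our own, and it falls back to ss where netstat is not installed):

```shell
# Sketch: check whether anything is listening on the default HDP ports.
# Assumption: default ports (50070 NameNode UI, 8020 HDFS RPC, 50075
# DataNode UI). Falls back from netstat to ss if netstat is missing.
is_port_listening() {
  { netstat -tln 2>/dev/null || ss -tln 2>/dev/null; } | grep -q ":$1 "
}

for p in 50070 8020 50075; do
  if is_port_listening "$p"; then
    echo "port $p: listening"
  else
    echo "port $p: nothing listening"
  fi
done
```

If a daemon shows up in `ps` but its port is not listening, check the daemon's log under /var/log/hadoop for startup errors.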