
ZooKeeper smoke test failing on step 9 ("Install, Start and Test") of Ambari setup

Rising Star

While installing, I get the following screen because of the error:

(screenshot attached: 2264-1583a432aa7fcefcb18894f98d4a593c26ff45e3efdcd72b80.png)

The installation has not completed: ZooKeeper failed the smoke test, which halted the remaining steps, and the installation page shows 100% complete with a warning. I have attached the zookeeper.txt file, which shows the error.
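For reference, the failing smoke test can be reproduced by hand. A minimal sketch — the hostname is assumed from this cluster's naming, and 2181 is ZooKeeper's default client port:

```shell
# Assumed host for this cluster; 2181 is ZooKeeper's default client port.
ZK_HOST="node1.dtitsupport247.net"
ZK_PORT=2181

# A healthy ZooKeeper answers the four-letter command "ruok" with "imok".
if REPLY=$(echo ruok | nc -w 5 "$ZK_HOST" "$ZK_PORT" 2>/dev/null); then
  echo "ZooKeeper replied: $REPLY"
else
  echo "Could not reach ZooKeeper on $ZK_HOST:$ZK_PORT"
fi
```

If this prints anything other than `imok`, the problem is with the ZooKeeper server itself rather than with Ambari's wizard.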

1 ACCEPTED SOLUTION

17 REPLIES

Master Mentor

@Kunal Gaikwad

I have experienced this several times.

Hit Next.

Move forward.

Then you can run the ZK service check later, once you are on the dashboard.

@Artem Ervits

It's a new install...no Kerberos in the picture.
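Once you are out of the wizard, the ZK service check can be run from the dashboard (Services > ZooKeeper > Run Service Check) or queued through the Ambari REST API. A sketch, assuming default admin credentials, Ambari on its default port 8080, and a placeholder server and cluster name:

```shell
# Placeholders -- substitute your Ambari server host and cluster name.
AMBARI="http://ambari-server.example.com:8080"
CLUSTER="mycluster"
CHECK_URL="$AMBARI/api/v1/clusters/$CLUSTER/requests"

# Queue a ZooKeeper service check (the same action as the dashboard's
# "Run Service Check" menu item).
curl -s -u admin:admin -H 'X-Requested-By: ambari' -X POST \
  -d '{"RequestInfo":{"context":"ZooKeeper Service Check","command":"ZOOKEEPER_QUORUM_SERVICE_CHECK"},"Requests/resource_filters":[{"service_name":"ZOOKEEPER"}]}' \
  "$CHECK_URL" || echo "Request failed -- is the Ambari server reachable?"
```

The response contains a request id whose progress you can watch on the dashboard's background operations panel.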

Rising Star

@Neeraj Sabharwal This is a fresh installation of a cluster; no Kerberos has been configured here.

I even refreshed the page, but the same thing happens. I don't want to reconfigure and reinstall; can we do something here?

Master Mentor

@Kunal Gaikwad

I know there is no Kerberos. Don't refresh the page.

Hit Next and accept...you will come out of the install wizard, and then you can run the ZK check.

Rising Star

I clicked Next and completed the installation, but many services are not running. Do you have a link I can refer to? I am not sure what caused the installation failure, so it is very difficult to decide what step to take. I am installing it for the first time.

Master Mentor

@Kunal Gaikwad

That's expected. Now start HDFS, MapReduce, YARN, ZooKeeper, Hive, and the other services manually.

FYI: your install did not fail. It's only the service check, and I have experienced it many times.

The install itself finishes at 33% of the progress bar; after that, the service-start phase begins.
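Starting each service can be done from the Ambari UI (Services > service > Start) or scripted through the REST API. A sketch with placeholder credentials, server host, and cluster name; `HDFS`, `MAPREDUCE2`, `YARN`, `ZOOKEEPER`, and `HIVE` are the stock Ambari service names:

```shell
# Placeholders -- substitute your Ambari server host and cluster name.
AMBARI="http://ambari-server.example.com:8080"
CLUSTER="mycluster"
SERVICE="HDFS"   # also: MAPREDUCE2, YARN, ZOOKEEPER, HIVE
SVC_URL="$AMBARI/api/v1/clusters/$CLUSTER/services/$SERVICE"

# Ask Ambari to move the service to the STARTED state.
curl -s -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Start HDFS via REST"},"Body":{"ServiceInfo":{"state":"STARTED"}}}' \
  "$SVC_URL" || echo "Request failed -- is the Ambari server reachable?"
```

Start HDFS and ZooKeeper first, since most of the other services depend on them.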

Rising Star

@Neeraj Sabharwal

Yes, I have been able to successfully start the HDFS services, but for MapReduce the History Server gives an error. Not sure why I am getting this error:

resource_management.core.exceptions.Fail: Execution of 'curl -sS -L -w '%{http_code}' -X PUT -T /usr/hdp/2.3.4.0-3485/hadoop/mapreduce.tar.gz 'http://node1.dtitsupport247.net:50070/webhdfs/v1/hdp/apps/2.3.4.0-3485/mapreduce/mapreduce.tar.gz?op=CREATE&user.name=hdfs&overwrite=True&permission=444' 1>/tmp/tmpZ5Y51c 2>/tmp/tmpnaJYGu' returned 7. curl: (7) Failed connect to node3.dtitsupport247.net:50075; No route to host

Master Mentor

@Kunal Gaikwad See this:

Failed connect to node3.dtitsupport247.net:50075; No route to host

Make sure the DataNodes are up and the connections between servers are working.
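"No route to host" usually points at the network layer rather than at Hadoop itself — most often a firewall (iptables/firewalld) on the DataNode. A few checks to run from the node where the curl failed, with the hostname and port taken from the error message:

```shell
DN_HOST="node3.dtitsupport247.net"   # from the error message
DN_PORT=50075                        # DataNode HTTP port

# 1. Basic reachability.
ping -c 2 -W 2 "$DN_HOST" || echo "Host unreachable (check DNS, /etc/hosts, routing)"

# 2. Is the DataNode HTTP port open?
nc -z -w 5 "$DN_HOST" "$DN_PORT" || echo "Port $DN_PORT closed or filtered (check the firewall)"

# 3. On the DataNode itself, a firewall is the usual culprit:
#    sudo systemctl stop firewalld   # RHEL/CentOS 7
#    sudo service iptables stop      # RHEL/CentOS 6
```

If step 1 succeeds but step 2 fails, a firewall rule on the DataNode is almost certainly blocking the port.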

Rising Star

@Neeraj Sabharwal

Yes, the DataNode and the NameNode are up, but I noticed that on the HDFS service page the NameNode UI quick link shows nothing; it says the web page is not available. I guess that is the reason it is not able to establish a connection, if I am not wrong? What can I do here?

My core-site.xml has:

<value>hdfs://node1.dtitsupport247.net:8020</value>

Even changing the port number here from 8020 to 50070 does not help; none of the quick links open, and it says the page does not exist.

I tried running:

 ps -ef | grep hadoop | grep -P  'namenode|datanode|tasktracker|jobtracker'

I have attached the output: outpt1.txt
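One note: the 8020 in core-site.xml is the NameNode RPC port, not the web UI port, so changing it to 50070 will break HDFS clients rather than fix the UI — it should stay at 8020. To see whether the web UI is actually listening, a sketch (the NameNode host is assumed from the thread):

```shell
NN_HOST="node1.dtitsupport247.net"   # assumed NameNode host

# On the NameNode itself: is anything listening on the HTTP port?
(netstat -tlnp 2>/dev/null || ss -tlnp 2>/dev/null) | grep 50070 \
  || echo "Nothing listening on 50070 -- is the NameNode process up?"

# From another machine: can the UI be fetched at all? Prints the HTTP status
# code (000 means no connection).
curl -s -o /dev/null -w '%{http_code}\n' "http://$NN_HOST:50070/" \
  || echo "Could not connect to http://$NN_HOST:50070/"
```

If nothing is listening on 50070, check the NameNode log under /var/log/hadoop/hdfs/ for startup errors; if it is listening but unreachable from your laptop, it is a firewall or name-resolution issue again.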

Master Mentor

@Kunal Gaikwad Is node1.dtitsupport247.net the FQDN? Are you able to reach that URL at all from your laptop?
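A quick way to verify the FQDN and name resolution on each node — a sketch; `getent` consults the same resolution order (/etc/hosts, then DNS) that the Hadoop services use:

```shell
# On each cluster node: does the machine know its own FQDN?
FQDN=$(hostname -f 2>/dev/null || echo "unknown")
echo "This node reports: $FQDN"

# Does the cluster hostname resolve (via /etc/hosts or DNS)?
getent hosts node1.dtitsupport247.net || echo "node1.dtitsupport247.net does not resolve"

# Ambari expects 'hostname -f' on every node to match the name the
# host was registered with during the install wizard.
```

If the names only resolve inside the cluster (for example, via /etc/hosts entries), the quick links will fail from a laptop that lacks those entries even though the services themselves are healthy.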