Unable to start the services while installing HDP 2.5.3 via Ambari

Expert Contributor

While trying to install HDP 2.5.3 on a 4-node cluster via the Ambari Wizard, I passed all the steps and got to the 'Install, Start and Test' step, but it gives a warning that many services failed to start. Please see the attached pictures. Did anyone face this issue?


Attachments: errorlog-hss.png, start-failed.png, service-start-fail-warning.png
1 ACCEPTED SOLUTION

Super Guru

@Geetha Anne

Sometimes the start will fail the first time, usually because of a timeout. If any one service fails, the remaining services will not be attempted; those are the ones shown in the yellow/orange color.

If you proceed, you should be able to start all services via "Start All" on the Ambari Dashboard.
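
For reference, the same "Start All" can also be issued through the Ambari REST API. The following is only a minimal sketch, assuming an Ambari server listening on port 8080, admin/admin credentials, and a cluster named "mycluster" (all placeholders you would change for your environment):

# Ask Ambari to move every installed service to the STARTED state
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Start All Services"},"Body":{"ServiceInfo":{"state":"STARTED"}}}' \
  'http://ambari-host:8080/api/v1/clusters/mycluster/services'

Ambari answers with a request id that you can then follow in the background operations dialog of the web UI.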


7 REPLIES


Expert Contributor
@Michael Young

Thanks. I went ahead and did a 'Restart All' on all of the services that failed to start. It worked for some of them. However, for a few others, like the History Server, HiveServer2, and the NameNode, I am still getting the following errors in the logs. Could you help me there? Thanks.

History Server:
raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'curl -sS -L -w '%{http_code}' -X PUT --data-binary @/usr/hdp/2.5.3.0-37/hadoop/mapreduce.tar.gz 'http://ganne-test0.field.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.3.0-37/mapreduce/mapreduce.tar.gz?op=CREATE&user.name=hdfs&overwrite=True&permission=444' 1>/tmp/tmpCBuCHv 2>/tmp/tmp76U3W6' returned 52. curl: (52) Empty reply from server
100

Hiveserver 2:
raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'curl -sS -L -w '%{http_code}' -X GET 'http://ganne-test0.field.hortonworks.com:50070/webhdfs/v1/user/hcat?op=GETFILESTATUS&user.name=hdfs' 1>/tmp/tmpNYUL1Z 2>/tmp/tmpij3JWm' returned 7. curl: (7) Failed connect to ganne-test0.field.hortonworks.com:50070; Connection refused
000
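
Both of these failures point at the NameNode HTTP endpoint on port 50070 rather than at the History Server or HiveServer2 themselves. As a quick check (only a sketch, reusing the hostname from the logs above), you could first verify whether WebHDFS answers at all:

# Does anything respond on the NameNode web port? A healthy NameNode returns a FileStatus JSON and a 200
curl -sS -w '\n%{http_code}\n' 'http://ganne-test0.field.hortonworks.com:50070/webhdfs/v1/?op=GETFILESTATUS&user.name=hdfs'

# On the NameNode host itself: is any process actually listening on 50070?
ss -tlnp | grep 50070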


It seems like a network connectivity issue: "Failed connect to ganne-test0.field.hortonworks.com:50070; Connection refused". Please make sure that in your Access & Security section (for OpenStack) you add an ingress rule for port 50070.
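
If the nodes really are behind an OpenStack security group, a rule along these lines would open the port. This is a sketch using the unified openstack CLI; the group name "default" and the CIDR are assumptions you would replace:

# Allow inbound TCP 50070 (NameNode web UI / WebHDFS) from the cluster subnet
openstack security group rule create --protocol tcp --dst-port 50070 --remote-ip 10.0.0.0/24 default

The same rule can also be added from the Horizon dashboard under Access & Security > Security Groups.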

Super Guru

@Geetha Anne

Port 50070 is the HDFS NameNode web UI (WebHDFS) port. Try starting just HDFS and see if it comes up OK. Then make sure ZooKeeper is running fine. After that, try a Start All.
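
As a rough sketch of that sequence (again assuming admin/admin credentials, an Ambari server on port 8080, a cluster named "mycluster", and a ZooKeeper server on the default port 2181; all placeholders):

# 1. Start only the HDFS service through the Ambari REST API
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Start HDFS"},"Body":{"ServiceInfo":{"state":"STARTED"}}}' \
  'http://ambari-host:8080/api/v1/clusters/mycluster/services/HDFS'

# 2. Check that ZooKeeper is serving requests ("imok" means it is healthy)
echo ruok | nc zk-host 2181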

New Contributor

Thanks, Michael. Do you mean that we should proceed past the error during the upgrade process and then restart all services afterwards?

I am having the same issue while upgrading from 2.4.0 to 2.5.3 using the Ambari upgrade.

Expert Contributor

I went back to HDP 2.5.0 and there are no errors now. It looks like 2.5.3 has this service start issue. Thanks for the help.

Guru

It is not specific to the HDP version; I just encountered the same thing on 2.4.3. It was resolved for me after changing the following properties in ambari.properties:

agent.threadpool.size.max
client.threadpool.size.max

The values should be set to match the actual CPU core count. I also increased the heap size for the NameNode and DataNode from the default 1 GB to 2 GB.
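
For anyone who wants to try the same change, here is a rough sketch on the Ambari server host (the value 8 is only an example for an 8-core machine; adjust it to your actual core count):

# /etc/ambari-server/conf/ambari.properties
agent.threadpool.size.max=8
client.threadpool.size.max=8

# restart the Ambari server so the new thread pool sizes take effect
ambari-server restart

The NameNode and DataNode heap sizes are changed separately in the Ambari UI under HDFS > Configs (hadoop-env), followed by a restart of the affected components.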