Member since: 01-28-2015 · Posts: 61 · Kudos Received: 35 · Solutions: 0
02-24-2016
08:59 AM
@Geoffrey Shelton Okot I checked the configs and everything looked fine, except that I was unable to navigate to http://node1.dtitsupport247.net:50070/webhdfs/ even locally. So I checked the Knox services and could see that the Knox gateway was installed, with all the users and directories in place. I then did a yum install knox, changed the port from 50070 back to 8020, checked a few config files (referring to related issues in the forum), restarted the cluster, and brought all the components back up. It is working now! Finally!
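For anyone hitting the same symptom, the key point in the fix above is that fs.defaultFS must use the NameNode RPC port (8020 by default on HDP 2.x), not the HTTP/WebHDFS UI port (50070). A minimal sketch of a helper that flags the mix-up (the helper name and the port values are assumptions based on standard Hadoop defaults, not something from this thread):

```shell
# HDP 2.x default ports (assumption; standard Hadoop defaults):
#   8020  - NameNode RPC, what fs.defaultFS must point at
#   50070 - NameNode HTTP UI / WebHDFS
# check_defaultfs is a hypothetical helper that flags a fs.defaultFS value
# mistakenly pointing at the HTTP port.
check_defaultfs() {
  local uri="$1"
  case "$uri" in
    *:8020)  echo "ok: RPC port" ;;
    *:50070) echo "wrong: 50070 is the HTTP/WebHDFS port, use 8020" ;;
    *)       echo "unknown port" ;;
  esac
}

check_defaultfs "hdfs://node1.dtitsupport247.net:8020"
check_defaultfs "hdfs://node1.dtitsupport247.net:50070"
```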
02-23-2016
12:06 PM
@Geoffrey Shelton Okot Yes, each of those steps has been followed, but the issue is still the same.
02-22-2016
11:45 AM
@Neeraj Sabharwal Yes, the DataNode and the NameNode are up, but I noticed that on the HDFS service page the NameNode UI quick link shows "webpage not available". I guess that is why it cannot establish a connection, if I am not wrong? What can I do here? My core-site.xml has <value>hdfs://node1.dtitsupport247.net:8020</value>
Even changing the port number here from 8020 to 50070 does not help; none of the quick links open, and the browser says the page does not exist. I tried running: ps -ef | grep hadoop | grep -P 'namenode|datanode|tasktracker|jobtracker' and have attached its output: outpt1.txt
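To double-check which URI core-site.xml actually carries, the value can be pulled out with a small grep/sed pipeline. A sketch below uses a temporary sample file mirroring the value from this thread; on a real node, point CONF at /etc/hadoop/conf/core-site.xml (the usual HDP location, an assumption here):

```shell
# Sketch: extract fs.defaultFS from a core-site.xml to confirm which port it
# points at. The sample file stands in for /etc/hadoop/conf/core-site.xml.
CONF=$(mktemp)
cat > "$CONF" <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://node1.dtitsupport247.net:8020</value>
  </property>
</configuration>
EOF

# Grab the line after the <name> tag and strip the <value> markup.
defaultfs=$(grep -A1 '<name>fs.defaultFS</name>' "$CONF" \
            | sed -n 's:.*<value>\(.*\)</value>.*:\1:p')
echo "fs.defaultFS = $defaultfs"
rm -f "$CONF"
```

(`xmllint --xpath` would be more robust than grep/sed if it is installed.)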
02-22-2016
10:21 AM
1 Kudo
@Neeraj Sabharwal Yes, I was able to successfully start the HDFS services, but for MapReduce the History Server gives an error. Not sure why I am getting this error: resource_management.core.exceptions.Fail: Execution of 'curl -sS -L -w '%{http_code}' -X PUT -T /usr/hdp/2.3.4.0-3485/hadoop/mapreduce.tar.gz 'http://node1.dtitsupport247.net:50070/webhdfs/v1/hdp/apps/2.3.4.0-3485/mapreduce/mapreduce.tar.gz?op=CREATE&user.name=hdfs&overwrite=True&permission=444' 1>/tmp/tmpZ5Y51c 2>/tmp/tmpnaJYGu' returned 7. curl: (7) Failed connect to node3.dtitsupport247.net:50075; No route to host
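A "No route to host" on port 50075 (the DataNode HTTP port the WebHDFS CREATE was redirected to) usually points at a firewall or network problem on that node rather than an HDFS misconfiguration. A sketch of a per-node port probe using bash's built-in /dev/tcp (the helper name is hypothetical; host and port come from the error above):

```shell
# Probe a TCP port; prints "open" or "unreachable". Uses bash's /dev/tcp
# pseudo-device, so it needs no extra tools; timeout bounds slow failures.
check_port() {
  local host=$1 port=$2
  if timeout 3 bash -c "echo > /dev/tcp/$host/$port" 2>/dev/null; then
    echo "$host:$port open"
  else
    echo "$host:$port unreachable"
  fi
}

# Host/port from the error in this thread:
check_port node3.dtitsupport247.net 50075
# If unreachable, check the firewall on node3, e.g.:
#   systemctl status firewalld    (or: service iptables status)
```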
02-22-2016
10:00 AM
1 Kudo
I clicked Next and completed the installation, but many services are not running. Do you have a link I can refer to? I am not sure what caused the installation failure, so it is very difficult to decide which step to take. I am installing it for the first time.
02-19-2016
03:25 PM
1 Kudo
@Neeraj Sabharwal This is a fresh installation of the cluster; no Kerberos has been configured here. I even refreshed the page, but the same thing happens. I don't want to reconfigure and reinstall; can we do something here?
02-19-2016
03:02 PM
2 Kudos
I get the following screen while installing, because of the error. The installation did not complete because ZooKeeper failed to pass the smoke test, which halted the remaining steps; the installation page shows 100% complete with a warning. I have attached the zookeeper.txt file, which shows the error.
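The ZooKeeper smoke test essentially checks that each server answers health probes. The same check can be run by hand with ZooKeeper's standard `ruok` four-letter command, to which a healthy server replies `imok`. A sketch, assuming the default client port 2181 and a hypothetical helper to interpret the reply:

```shell
# zk_ok is a hypothetical helper: given the raw reply from a ZooKeeper
# 'ruok' probe, report whether the server looks healthy.
zk_ok() {
  # A healthy ZooKeeper server replies exactly "imok" to 'ruok'.
  [ "$1" = "imok" ] && echo "healthy" || echo "not healthy"
}

# On a live cluster you would feed it the real reply, e.g.:
#   reply=$(echo ruok | nc node1.dtitsupport247.net 2181)
#   zk_ok "$reply"
zk_ok "imok"
```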
Labels:
- Apache Ambari
- Apache Hadoop
02-17-2016
10:13 AM
I too agree! Thank you @Rahul Pathak
02-17-2016
08:03 AM
@Neeraj Sabharwal @Artem Ervits I hope this thread helps in understanding the error.
02-17-2016
07:27 AM
This is what the Warning/Failure links look like (screenshots attached for Node 1, Node 2, and Node 3):