Member since: 04-27-2016
Posts: 61
Kudos Received: 61
Solutions: 3
My Accepted Solutions
Title | Views | Posted
---|---|---
| 3845 | 09-19-2016 05:42 PM
| 1173 | 06-11-2016 06:41 AM
| 3214 | 06-10-2016 05:17 PM
05-31-2018
06:16 PM
While trying to start the NameNode from Ambari, it fails with the below error:
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ; /usr/hdp/3.0.0.0-1371/hadoop/bin/hdfs --config /usr/hdp/3.0.0.0-1371/hadoop/conf --daemon start namenode'' returned 1. WARNING: HADOOP_NAMENODE_OPTS has been replaced by HDFS_NAMENODE_OPTS. Using value of HADOOP_NAMENODE_OPTS.
ERROR: Cannot set priority of namenode process 355
Does anybody know a solution? Thanks in advance.
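For reference, the stack trace behind "Cannot set priority of namenode process" usually ends up in the NameNode's own log files rather than in the Ambari output. A minimal way to pull the tail of them, assuming the default HDP log directory (adjust if the HDFS log dir was customized in hadoop-env):

```bash
# Assumes the default HDP log directory; adjust if the HDFS log dir was customized.
tail -n 100 /var/log/hadoop/hdfs/hadoop-hdfs-namenode-*.out
tail -n 100 /var/log/hadoop/hdfs/hadoop-hdfs-namenode-*.log
```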
Labels:
- Apache Ambari
- Apache Hadoop
01-23-2017
06:36 PM
1 Kudo
Looking for help with the below support-related questions:
1. What versions of the software do you currently provide support for?
2. What was the release date of the oldest version currently supported?
Labels:
- Hortonworks Data Platform (HDP)
12-21-2016
07:16 PM
I went back to HDP 2.5.0 and there are no errors now. It looks like 2.5.3 has this service-start issue. Thanks for the help.
12-20-2016
10:45 PM
@Michael Young Thanks. I went ahead and did a 'Restart All' on all the services that failed to start. It worked for some of them. However, for a few others like the History Server, HiveServer2, NameNode, etc., I am still getting the following errors in the logs. Could you help me there? Thanks.
History Server:
raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'curl -sS -L -w '%{http_code}' -X PUT --data-binary @/usr/hdp/2.5.3.0-37/hadoop/mapreduce.tar.gz 'http://ganne-test0.field.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.3.0-37/mapreduce/mapreduce.tar.gz?op=CREATE&user.name=hdfs&overwrite=True&permission=444' 1>/tmp/tmpCBuCHv 2>/tmp/tmp76U3W6' returned 52. curl: (52) Empty reply from server
100
Hiveserver 2:
raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'curl -sS -L -w '%{http_code}' -X GET 'http://ganne-test0.field.hortonworks.com:50070/webhdfs/v1/user/hcat?op=GETFILESTATUS&user.name=hdfs' 1>/tmp/tmpNYUL1Z 2>/tmp/tmpij3JWm' returned 7. curl: (7) Failed connect to ganne-test0.field.hortonworks.com:50070; Connection refused
000
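Both failures point at WebHDFS on port 50070 not answering, which usually means the NameNode on that host is not up. A quick sanity check from any node, assuming the same NameNode host as in the errors above and simple (non-Kerberos) authentication:

```bash
# Assumes the same NameNode host as in the errors above and simple (non-Kerberos) auth.
curl -sS 'http://ganne-test0.field.hortonworks.com:50070/webhdfs/v1/?op=LISTSTATUS&user.name=hdfs'
# If this is also refused, get the NameNode up first, then retry History Server and HiveServer2.
```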
12-20-2016
10:07 PM
1 Kudo
Repo Description
The DiP API has been tested on the below-mentioned HDP 2.4 components:
- Apache Hadoop 2.7.1.2.4
- Apache Kafka 0.9.0.2.4
- Apache Apex 3.4.0
- Apache HBase 1.1.2.2.4
- Apache Hive 1.2.1.2.4
- Apache Zeppelin 0.6.0.2.4
- Apache Tomcat Server 8.0
- Apache Phoenix 4.4.0.2.4
- Apache Maven
- Java 1.7 or later
Repo Info
Github Repo URL: https://github.com/XavientInformationSystems/Data-Ingestion-Platform/tree/master/dataingest-apex
Github account name: XavientInformationSystems
Repo name: dataingest-apex
Labels:
12-20-2016
10:02 PM
3 Kudos
While trying to install HDP 2.5.3 on a 4-node cluster via the Ambari wizard, I passed all the steps and got to the 'Install, Start and Test' step, but it gives a warning that many services failed to start. Please see the attached pictures. Did anyone face this issue?
Labels:
12-20-2016
09:43 PM
1 Kudo
Repo Description
You can now deploy DataTorrent RTS within the Ambari stack. Such a deployment ensures simplified management of the DataTorrent RTS setup.
Note: DataTorrent RTS, powered by Apache Apex, provides a high-performing, fault-tolerant, scalable, easy-to-use data processing platform for batch and streaming workloads. It includes advanced management, monitoring, development, visualization, data ingestion, and distribution features.
Repo Info
Github Repo URL: https://github.com/DataTorrent/ambari-datatorrent-service
Github account name: DataTorrent
Repo name: ambari-datatorrent-service
Labels:
11-14-2016
03:42 PM
Appreciate your input. Yes, the issue was the keytab and principal. I had to kinit with the specific principal instead of the default one, and the error was gone. Thanks.
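For anyone hitting the same thing, the fix boiled down to authenticating as the right principal before submitting the topology; a minimal sketch, where the keytab path and principal are placeholders for whatever your cluster actually uses:

```bash
# Placeholders below: substitute the keytab and principal for your own cluster.
kinit -kt /etc/security/keytabs/myuser.keytab myuser@EXAMPLE.COM
klist   # confirm the ticket is for the expected principal before re-submitting the topology
```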
11-11-2016
07:31 PM
Running into the errors shown in the attached pictures and log file while trying to submit a topology on a Kerberized cluster. Nimbus is running on lake2.field.hortonworks.com. Can any Kerberos gurus shed some light here? Appreciate any help!
Labels:
- Apache Kafka
- Apache Storm