I am installing Hadoop using Ambari.
I got an error during the Spark client install.
Here is the error:
Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install 'spark_2_2_*'' returned 1. Error: Nothing to do
2016-09-13 16:35:38,220 - Could not determine HDP version for component spark-client by calling '/usr/bin/hdp-select status spark-client > /tmp/tmpJxg9F3'. Return Code: 1, Output: ERROR: Invalid package - spark-client
Packages: accumulo-client accumulo-gc accumulo-master accumulo-monitor accumulo-tablet accumulo-tracer falcon-client falcon-server flume-server hadoop-client hadoop-hdfs-datanode hadoop-hdfs-journalnode hadoop-hdfs-namenode hadoop-hdfs-nfs3 hadoop-hdfs-portmap hadoop-hdfs-secondarynamenode hadoop-mapreduce-historyserver hadoop-yarn-nodemanager hadoop-yarn-resourcemanager hadoop-yarn-timelineserver hbase-client hbase-master hbase-regionserver hive-metastore hive-server2 hive-webhcat kafka-broker knox-server mahout-client oozie-client oozie-server phoenix-client ranger-admin ranger-usersync slider-client sqoop-client sqoop-server storm-client storm-nimbus storm-slider-client storm-supervisor zookeeper-client zookeeper-server
Aliases: accumulo-server all client hadoop-hdfs-server hadoop-mapreduce-server hadoop-yarn-server hive-server
Try this (please replace the version 2.2.x.x.x with your version number):

rm -r /usr/hdp/current/spark-client
ln -s /usr/hdp/2.2.x.x.x/spark /usr/hdp/current/spark-client
So it should look like:

ls -l /usr/hdp/current/spark-client
lrwxrwxrwx. 1 root root 26 Sep 14 13:04 /usr/hdp/current/spark-client -> /usr/hdp/2.2.x.x.x/spark
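The two commands above can be sketched end to end. This is a minimal, hedged version that runs against a throwaway directory instead of the real /usr/hdp, with a placeholder version string, so you can try the logic safely before touching the host:

```shell
#!/bin/sh
set -e
# Sketch of the symlink repair above, run against a temporary directory
# instead of the real /usr/hdp. The version string is a placeholder --
# on a real host, substitute your installed HDP version.
hdp=$(mktemp -d)              # stands in for /usr/hdp
ver="2.2.x.x.x"               # placeholder version
mkdir -p "$hdp/$ver/spark" "$hdp/current"

# The fix: remove any stale entry, then point current/spark-client
# at the versioned Spark directory.
rm -rf "$hdp/current/spark-client"
ln -s "$hdp/$ver/spark" "$hdp/current/spark-client"

# Verify: the link should resolve to the versioned directory.
readlink "$hdp/current/spark-client"
rm -rf "$hdp"
```

On a real host you would keep /usr/hdp and drop the mktemp scaffolding; the `rm` plus `ln -s` pair is the actual fix.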
That error message indicates the Spark package is not available in the configured repositories.
Are you adding Spark to an existing installation, or setting up a new cluster? If possible, can you complete the install without Spark and add it individually afterwards? Also, what version of HDP are you installing/using?
Please try running /usr/bin/hdp-select status spark-client and let me know what it returns.
Thanks Ben for the reply.
It is a new setup. By the way, I found the issue: Spark will work with HDP 2.4. I excluded Spark and installed the rest of the services, and the installation completed successfully.
Now, when Ambari tries to start the DataNode, it fails with this error:
Fail: Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ; /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start datanode'' returned 1.
-bash: line 0: ulimit: core file size: cannot modify limit: Operation not permitted
starting datanode, logging to /var/log/hadoop/hdfs/hadoop-hdfs-datanode-pppdc9prdc96.out
Since this issue doesn't appear to be related on the surface, you may want to close this question and ask a new one; you will get more help that way.
In the meantime, to get started on the troubleshooting please run the following and post the results.
grep -v ^# /etc/security/limits.conf
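The `cannot modify limit: Operation not permitted` message usually means the requested value is above the hard limit, which an unprivileged process cannot raise. A small sketch to inspect both limits in the current shell (run it as the hdfs user to see what the daemon sees):

```shell
#!/bin/sh
# Print the soft and hard core-file-size limits for the current shell.
# "ulimit -c unlimited" fails with "Operation not permitted" when the
# hard limit is finite, since an unprivileged process cannot raise
# its soft limit above the hard limit.
echo "soft core limit: $(ulimit -S -c)"
echo "hard core limit: $(ulimit -H -c)"
```

If the hard limit is finite, the limits.conf output you post should show where it is being set.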
@Jaspreet Singh Can you please provide the info below:

1. ls -ltr /usr/hdp/
2. ls -ltr /usr/hdp/current/
3. yum info spark-client
4. Run the commands below manually on the host and upload the output:
   4.1 yum clean all
   4.2 /usr/bin/yum -d 0 -e 0 -y install 'spark_2_2_*'
Please let me know how it goes.
[root@pppdc9prdc8u hdp]# yum info spark-client
Loaded plugins: security
HDP-2.2 | 2.9 kB 00:00
HDP-UTILS-184.108.40.206 | 2.9 kB 00:00
Updates-ambari-2.0.0 | 2.9 kB 00:00
Error: No matching Packages to list
[root@pppdc9prdc8u hdp]# yum clean all
Loaded plugins: security
Cleaning repos: HDP-2.2 HDP-UTILS-220.127.116.11 Updates-ambari-2.0.0 rhel_current saltstack
Cleaning up Everything
[root@pppdc9prdc8u hdp]# /usr/bin/yum -d 0 -e 0 -y install 'spark_2_2_*'
Thanks jk for your quick response. Really appreciate it!
Still the same issue.
This is what I have done:
[root@pppdc9prdc8u ~]# rm -r /usr/hdp/current/spark-client
rm: cannot remove `/usr/hdp/current/spark-client': No such file or directory
[root@pppdc9prdc8u ~]# cd /usr/hdp/
[root@pppdc9prdc8u hdp]# ls
18.104.22.168-2041  current
[root@pppdc9prdc8u hdp]# ln -s /usr/hdp/22.214.171.124-2041/spark /usr/hdp/current/spark-client
[root@pppdc9prdc8u hdp]# ls -l /usr/hdp/current/spark-client
lrwxrwxrwx 1 root root 27 Sep 14 11:26 /usr/hdp/current/spark-client -> /usr/hdp/126.96.36.199-2041/spark
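One caveat with the transcript above: `ln -s` succeeds even when the target directory does not exist, and since Spark was never actually installed under the versioned directory, the new link may be dangling. A small hedged check, using a placeholder default path, that distinguishes a working link from a dangling one:

```shell
#!/bin/sh
# Check whether a symlink actually resolves. A dangling link still
# shows up in "ls -l", but "-e" follows the link and fails if the
# target is missing. The default path below is a placeholder.
link="${1:-/usr/hdp/current/spark-client}"
if [ -e "$link" ]; then
    echo "$link resolves to $(readlink -f "$link")"
else
    echo "$link is dangling or missing"
fi
```

If it reports dangling, the symlink fix alone cannot help; the Spark package itself still needs to be installed from a repository that contains it.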