Created 04-11-2017 02:34 PM
Hello,
I was following this community post to install the Hadoop client without yum, but with the latest HDP repo (2.5.3.0) I am getting the exception below. I want to install the HDFS client on our HDF cluster so it can access the HDP cluster's HDFS.
Any suggestions on approaches or how to do it?
I installed the repo using the below URLs:
Install command:
rpm -Uvh hadoop_2_5_3_0_37-2.7.3.2.5.3.0-37.el6.x86_64.rpm hadoop_2_5_3_0_37-hdfs-2.7.3.2.5.3.0-37.el6.x86_64.rpm hadoop_2_5_3_0_37-client-2.7.3.2.5.3.0-37.el6.x86_64.rpm hadoop_2_5_3_0_37-mapreduce-2.7.3.2.5.3.0-37.el6.x86_64.rpm hadoop_2_5_3_0_37-libhdfs-2.7.3.2.5.3.0-37.el6.x86_64.rpm hadoop_2_5_3_0_37-yarn-2.7.3.2.5.3.0-37.el6.x86_64.rpm zookeeper_2_5_3_0_37-3.4.6.2.5.3.0-37.el6.noarch.rpm bigtop-jsvc-1.0.15-37.el6.x86_64.rpm
error:
Failed dependencies:
ranger_2_5_3_0_37-hdfs-plugin is needed by hadoop_2_5_3_0_37-2.7.3.2.5.3.0-37.el6.x86_64
ranger_2_5_3_0_37-yarn-plugin is needed by hadoop_2_5_3_0_37-2.7.3.2.5.3.0-37.el6.x86_64
hdp-select >= 2.5.3.0-37 is needed by hadoop_2_5_3_0_37-2.7.3.2.5.3.0-37.el6.x86_64
spark_2_5_3_0_37-yarn-shuffle is needed by hadoop_2_5_3_0_37-2.7.3.2.5.3.0-37.el6.x86_64
spark2_2_5_3_0_37-yarn-shuffle is needed by hadoop_2_5_3_0_37-2.7.3.2.5.3.0-37.el6.x86_64
nc is needed by hadoop_2_5_3_0_37-2.7.3.2.5.3.0-37.el6.x86_64
hdp-select >= 2.5.3.0-37 is needed by zookeeper_2_5_3_0_37-3.4.6.2.5.3.0-37.el6.noarch
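The "Failed dependencies" output can be turned into a download checklist. A minimal sketch (using the error text above verbatim) that extracts the unique missing package names:

```shell
# rpm's "Failed dependencies" output, copied from the error above.
errors='ranger_2_5_3_0_37-hdfs-plugin is needed by hadoop_2_5_3_0_37-2.7.3.2.5.3.0-37.el6.x86_64
ranger_2_5_3_0_37-yarn-plugin is needed by hadoop_2_5_3_0_37-2.7.3.2.5.3.0-37.el6.x86_64
hdp-select >= 2.5.3.0-37 is needed by hadoop_2_5_3_0_37-2.7.3.2.5.3.0-37.el6.x86_64
spark_2_5_3_0_37-yarn-shuffle is needed by hadoop_2_5_3_0_37-2.7.3.2.5.3.0-37.el6.x86_64
spark2_2_5_3_0_37-yarn-shuffle is needed by hadoop_2_5_3_0_37-2.7.3.2.5.3.0-37.el6.x86_64
nc is needed by hadoop_2_5_3_0_37-2.7.3.2.5.3.0-37.el6.x86_64
hdp-select >= 2.5.3.0-37 is needed by zookeeper_2_5_3_0_37-3.4.6.2.5.3.0-37.el6.noarch'

# The first field of each line is the required capability/package name;
# deduplicate to get the list of packages still to be downloaded.
missing=$(printf '%s\n' "$errors" | awk '{print $1}' | sort -u)
printf '%s\n' "$missing"
```

That yields six packages to fetch (the two ranger plugins, the two spark shuffle packages, hdp-select, and nc) before retrying the install.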
Thank you in advance!
Created 04-12-2017 03:30 PM
The above approach will not work, as it requires 'hdp-select'. According to Hortonworks, HDP is not allowed on a cluster where HDF is installed, and vice versa. An adequate solution is to install Apache Hadoop (the same version as your HDP).
Steps I followed:
I hope this will help somebody in future!
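The exact steps aren't shown above, but a minimal sketch of such a client-only Apache Hadoop install might look like the following. The NameNode address `nn1.example.com:8020` and the install location are placeholders, and the sketch assumes HDP 2.5.3 bundles Hadoop 2.7.3, so that is the Apache version matched here; adjust both to your cluster.

```shell
# Sketch of a manual Apache Hadoop client install (no yum, no hdp-select).
# Assumptions: HDP 2.5.3 bundles Hadoop 2.7.3; nn1.example.com:8020 is a
# placeholder for your actual NameNode address.
HADOOP_VERSION=2.7.3
HADOOP_HOME="$PWD/hadoop-${HADOOP_VERSION}"

# 1. Download and unpack the matching Apache release (commented out here,
#    since it needs network access):
# wget https://archive.apache.org/dist/hadoop/core/hadoop-${HADOOP_VERSION}/hadoop-${HADOOP_VERSION}.tar.gz
# tar -xzf hadoop-${HADOOP_VERSION}.tar.gz

# 2. Point the client at the remote HDP cluster via core-site.xml:
mkdir -p "${HADOOP_HOME}/etc/hadoop"
cat > "${HADOOP_HOME}/etc/hadoop/core-site.xml" <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://nn1.example.com:8020</value>
  </property>
</configuration>
EOF

# 3. Then list HDFS from the client (needs the download from step 1):
# "${HADOOP_HOME}/bin/hdfs" dfs -ls /
```

If you can, copying core-site.xml and hdfs-site.xml from an existing HDP node is usually safer than writing them by hand.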
Created 04-12-2017 12:51 PM
You can download the tar file from the link below; it contains all the RPM dependencies.
http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.5.0.0/HDP-2.5.0.0-centos7-rpm.tar.gz
Download the dependencies from the below links: http://public-repo-
ftp://bo.mirror.garr.it/1/slc/centos/7.1.1503/os/x86_64/Packages/nmap-ncat-6.40-4.el7.x86_64.rpm
Now you can install them using the below command:
rpm -Uvh hadoop_2_5_3_0_37-2.7.3.2.5.3.0-37.el6.x86_64.rpm hadoop_2_5_3_0_37-hdfs-2.7.3.2.5.3.0-37.el6.x86_64.rpm hadoop_2_5_3_0_37-client-2.7.3.2.5.3.0-37.el6.x86_64.rpm hadoop_2_5_3_0_37-mapreduce-2.7.3.2.5.3.0-37.el6.x86_64.rpm hadoop_2_5_3_0_37-libhdfs-2.7.3.2.5.3.0-37.el6.x86_64.rpm hadoop_2_5_3_0_37-yarn-2.7.3.2.5.3.0-37.el6.x86_64.rpm zookeeper_2_5_3_0_37-3.4.6.2.5.3.0-37.el6.noarch.rpm bigtop-jsvc-1.0.15-37.el6.x86_64.rpm ranger_2_5_3_0_37-hdfs-plugin-0.6.0.2.5.3.0-37.el6.x86_64.rpm ranger_2_5_3_0_37-yarn-plugin-0.6.0.2.5.3.0-37.el6.x86_64.rpm hdp-select-2.5.3.0-37.el6.noarch.rpm spark_2_5_3_0_37-yarn-shuffle-1.6.2.2.5.3.0-37.el6.noarch.rpm spark2_2_5_3_0_37-yarn-shuffle-2.0.0.2.5.3.0-37.el6.noarch.rpm nmap-ncat-6.40-4.el7.x86_64.rpm
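Before running a transaction this large, it can help to keep the package list in a file and dry-run it first. A small sketch (filenames taken from the command above; `rpm --test` only checks the transaction and installs nothing):

```shell
# Keep the full package list in one file so the rpm command stays readable.
cat > hdp-client-rpms.txt <<'EOF'
hadoop_2_5_3_0_37-2.7.3.2.5.3.0-37.el6.x86_64.rpm
hadoop_2_5_3_0_37-hdfs-2.7.3.2.5.3.0-37.el6.x86_64.rpm
hadoop_2_5_3_0_37-client-2.7.3.2.5.3.0-37.el6.x86_64.rpm
hadoop_2_5_3_0_37-mapreduce-2.7.3.2.5.3.0-37.el6.x86_64.rpm
hadoop_2_5_3_0_37-libhdfs-2.7.3.2.5.3.0-37.el6.x86_64.rpm
hadoop_2_5_3_0_37-yarn-2.7.3.2.5.3.0-37.el6.x86_64.rpm
zookeeper_2_5_3_0_37-3.4.6.2.5.3.0-37.el6.noarch.rpm
bigtop-jsvc-1.0.15-37.el6.x86_64.rpm
ranger_2_5_3_0_37-hdfs-plugin-0.6.0.2.5.3.0-37.el6.x86_64.rpm
ranger_2_5_3_0_37-yarn-plugin-0.6.0.2.5.3.0-37.el6.x86_64.rpm
hdp-select-2.5.3.0-37.el6.noarch.rpm
spark_2_5_3_0_37-yarn-shuffle-1.6.2.2.5.3.0-37.el6.noarch.rpm
spark2_2_5_3_0_37-yarn-shuffle-2.0.0.2.5.3.0-37.el6.noarch.rpm
nmap-ncat-6.40-4.el7.x86_64.rpm
EOF

# Dry-run the transaction first; --test reports dependency problems
# without installing anything. (Commented out here because it needs the
# downloaded .rpm files present in the current directory.)
# xargs rpm -Uvh --test < hdp-client-rpms.txt
# If the dry run is clean, do the real install:
# xargs rpm -Uvh < hdp-client-rpms.txt
```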
OR
Download the Apache Hadoop 2.8.0 tar file and configure it to match your cluster, so that you can access your Hadoop cluster from this client.
https://archive.apache.org/dist/hadoop/core/hadoop-2.8.0/hadoop-2.8.0.tar.gz
Created 04-12-2017 03:07 PM
Thank you @arjun more. But this does not work on an HDF cluster. I took another approach and installed the Apache client manually.
Created 04-12-2017 06:49 PM
Yes @Shashant Panwar, but both of these worked for me, and I already mentioned this manual Apache Hadoop client installation approach in my comment above.
Created 04-12-2017 07:03 PM
@arjun more I did not notice the last line. Yes, I took that approach.