
Script must be run where Hive is installed

Contributor

I'm trying to build and generate data using the following repository:

https://github.com/cartershanklin/hive-testbench

But whenever I try to run, for example, ./tpch-setup.sh 2, I get the following message:

Script must be run where Hive is installed

Can anyone help me with this?

Thanks,

Mohammed

6 REPLIES

Master Mentor

@Mohammed Syam

The script looks for the "hive" command-line utility, which is usually present on the HiveServer2 host or on a machine where the Hive client is installed. It is complaining because Hive is not installed on the host where you are trying to run the script:

https://github.com/cartershanklin/hive-testbench/blob/master/tpch-setup.sh#L20-L24

which hive > /dev/null 2>&1
if [ $? -ne 0 ]; then
    echo "Script must be run where Hive is installed"
    exit 1
fi
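
As an aside, the same check is often written with the POSIX "command -v" builtin instead of "which"; this is just a sketch, not code from the testbench repo:

if ! command -v hive > /dev/null 2>&1; then
    echo "Script must be run where Hive is installed"
    exit 1
fi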


You can find out why it is failing by running the hive command on the host where the script runs: either Hive is not installed on that host, or your PATH variable does not point to the hive binary.

# which hive
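
If that prints nothing, check whether the hive binary exists but is simply missing from your PATH. A quick sketch (the /usr/hdp/current/hive-client location is the usual HDP client path and an assumption here):

# echo $PATH
# ls -l /usr/hdp/current/hive-client/bin/hive
# export PATH=$PATH:/usr/hdp/current/hive-client/bin    # only if the file above exists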


Contributor

@Jay Kumar SenSharma When running "which hive" I got the following:

/usr/bin/which: no hive in (/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin:/bin:/sbin:/root/bin)

I'm not sure what I should do next. Please note that I'm using the HDP sandbox.

Thanks for the help.

Rising Star

@Mohammed Syam You can check which host HiveServer2 is running on by going to the Ambari UI and clicking HiveServer2 under the Hive service section. Log in to that host and run the script. Before running it, you can confirm that Hive is available with the command 'which hive', as mentioned above. A command-line alternative is sketched below.
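
If you prefer the command line, the Ambari REST API can report the same thing; a sketch assuming the default admin/admin credentials, Ambari on port 8080, and a cluster named Sandbox:

# curl -s -u admin:admin http://localhost:8080/api/v1/clusters/Sandbox/services/HIVE/components/HIVE_SERVER | grep host_name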

Contributor

@dthakkar By logging in to the host, do you mean the VMware VM or the sandbox itself? I am already doing this.

I'm using the terminal inside the sandbox to run the script. I also ran the command "which hive" and got the following:

/usr/bin/which: no hive in (/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin:/bin:/sbin:/root/bin)

Master Mentor

@Mohammed Syam

The following output of the "which hive" command indicates that the machine where you are trying to run the script does not meet the prerequisite of having the Hive client installed:

# which hive
/usr/bin/which: no hive in (/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin:/bin:/sbin:/root/bin)


Or you may have logged in to an incorrect shell or user account where the PATH variable is not set properly.

So I suggest you do the following to run the script:

1. Log in to the HDP sandbox using SSH (on port 2222 only) as follows (if you are using PuTTY, enter 2222 as the SSH port):

# ssh root@localhost -p 2222
Enter Password:    hadoop

2. Once you are able to log in to the sandbox over SSH on port 2222, try running your script:

# which hive
# echo $PATH
# cd  /PATH/TO/Project
# ./tpch-setup.sh


Please share the output of the above commands.


Contributor

@Jay Kumar SenSharma I followed your instructions, and below is what I got. I'm more of a UI guy, so I'm not very familiar with command-line systems. The problem now is that when I connect on port 2222 I can't see the folder that contains the project. Would you please help with this? I appreciate the help.

[root@sandbox ~]# pwd
/root
[root@sandbox ~]# ls
anaconda-ks.cfg Desktop initial-setup-ks.cfg Public testbench
apache-maven-3.0.5 Documents Music start_scripts Videos
apache-maven-3.0.5-bin.tar.gz Downloads Pictures Templates
[root@sandbox ~]# ssh root@localhost -p 2222
root@localhost's password:
Last login: Fri Jan 12 11:17:06 2018 from 172.17.0.1
[root@sandbox ~]# pwd
/root
[root@sandbox ~]# ls
anaconda-ks.cfg build.out install.log sandbox.info start_hbase.sh
blueprint.json hdp install.log.syslog start_ambari.sh
[root@sandbox ~]# which hive
/usr/bin/hive

[root@sandbox ~]# echo $PATH
/usr/lib64/qt-3.3/bin:/usr/lib/jvm/java/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/hdp/current/falcon-client/bin:/usr/hdp/current/hadoop-mapreduce-historyserver/bin:/usr/hdp/current/oozie-client/bin:/usr/hdp/current/falcon-server/bin:/usr/hdp/current/hadoop-yarn-client/bin:/usr/hdp/current/oozie-server/bin:/usr/hdp/current/flume-client/bin:/usr/hdp/current/hadoop-yarn-nodemanager/bin:/usr/hdp/current/pig-client/bin:/usr/hdp/current/flume-server/bin:/usr/hdp/current/hadoop-yarn-resourcemanager/bin:/usr/hdp/current/slider-client/bin:/usr/hdp/current/hadoop-client/bin:/usr/hdp/current/hadoop-yarn-timelineserver/bin:/usr/hdp/current/sqoop-client/bin:/usr/hdp/current/hadoop-hdfs-client/bin:/usr/hdp/current/hbase-client/bin:/usr/hdp/current/sqoop-server/bin:/usr/hdp/current/hadoop-hdfs-datanode/bin:/usr/hdp/current/hbase-master/bin:/usr/hdp/current/storm-client/bin:/usr/hdp/current/hadoop-hdfs-journalnode/bin:/usr/hdp/current/hbase-regionserver/bin:/usr/hdp/current/storm-nimbus/bin:/usr/hdp/current/hadoop-hdfs-namenode/bin:/usr/hdp/current/hive-client/bin:/usr/hdp/current/storm-supervisor/bin:/usr/hdp/current/hadoop-hdfs-nfs3/bin:/usr/hdp/current/hive-metastore/bin:/usr/hdp/current/zookeeper-client/bin:/usr/hdp/current/hadoop-hdfs-portmap/bin:/usr/hdp/current/hive-server2/bin:/usr/hdp/current/zookeeper-server/bin:/usr/hdp/current/hadoop-hdfs-secondarynamenode/bin:/usr/hdp/current/hive-webhcat/bin:/usr/hdp/current/hadoop-mapreduce-client/bin:/usr/hdp/current/knox-server/bin:/usr/hdp/current/hadoop-client/sbin:/usr/hdp/current/hadoop-hdfs-nfs3/sbin:/usr/hdp/current/hadoop-yarn-client/sbin:/usr/hdp/current/hadoop-hdfs-client/sbin:/usr/hdp/current/hadoop-hdfs-portmap/sbin:/usr/hdp/current/hadoop-yarn-nodemanager/sbin:/usr/hdp/current/hadoop-hdfs-datanode/sbin:/usr/hdp/current/hadoop-hdfs-secondarynamenode/sbin:/usr/hdp/current/hadoop-yarn-resourcemanager/sbin:/usr/hdp/current/hadoop-hdfs-journalnode/sbin:/usr/hdp/current/hadoop-mapreduce-client/sbin:/usr/hdp/current/hadoop-yarn-timelineserver/sbin:/usr/hdp/current/hadoop-hdfs-namenode/sbin:/usr/hdp/current/hadoop-mapreduce-historyserver/sbin:/usr/hdp/current/hive-webhcat/sbin:/root/bin
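
Note that the two "ls" listings above show two different /root directories: the outer shell (where the testbench folder lives) and the environment reached over port 2222 (where hive is on the PATH). One way to proceed, sketched here under the assumption that the project sits at /root/testbench in the outer shell, is to copy it into the port-2222 environment and run the setup there:

# run these from the outer shell, the one whose listing shows the testbench folder
scp -P 2222 -r /root/testbench root@localhost:/root/
ssh root@localhost -p 2222
cd /root/testbench/hive-testbench    # hypothetical path; adjust to wherever the repo was cloned
./tpch-setup.sh 2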