Created on 02-20-2015 10:31 AM - edited 09-16-2022 02:22 AM
Hi Team, I am facing a very strange issue with the latest Cloudera installation. I can browse the HDFS directories from the web interface, but when I run a simple hadoop fs -ls in a PuTTY shell it says -bash: hadoop: command not found, even though all the HDFS files are visible from the web interface. Can you please help?
[root@hadoop-vm3 log]# hadoop fs -ls /
-bash: hadoop: command not found
Version: Cloudera Express 5.3.1 (#191 built by jenkins on 20150123-2020 git: b0377087cf605a686591e659eb14078923bc3c83)
Server Time: Feb 20, 2015 1:29:11 PM, Eastern Standard Time (EST)
Sample rows from the HDFS file browser in the web interface:
drwxrwxrwt | admin | hive | 0 B | 0 | 0 B | .hive-staging_hive_2015-02-20_04-40-09_720_8287848305105515146-1
drwxrwxrwt | root | hive | 0 B | 0 | 0 B | ttime=2015-02-20
drwxrwxrwt | root | hive | 0 B | 0 | 0 B |
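Since the web UI can list these files while the login shell cannot find the command at all, a useful first check is whether any hadoop launcher is visible to that shell (a minimal sketch; the paths below are the standard CDH parcel defaults and may differ on your hosts):
# Does this login shell see a hadoop command anywhere on its PATH?
type -a hadoop || echo "hadoop not found on PATH"
echo $PATH
# On a parcel install the usual chain is /usr/bin/hadoop -> /etc/alternatives/hadoop -> parcel binary
ls -l /usr/bin/hadoop /etc/alternatives/hadoop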
Created 02-20-2015 09:12 PM
Created 02-21-2015 12:18 PM
Hi Gautam,
Thanks a lot for your reply. I need some more information to fix this: can it be fixed from Cloudera Manager, or do I need to fix it on each machine by running some command?
I checked /etc/alternatives/hadoop and it points to
hadoop -> /opt/cloudera/parcels/CDH-5.3.1-1.cdh5.3.1.p0.5/bin/hadoop
I also found a hadoop entry in /var/lib/alternatives, so now I am not sure what to delete here.
So my question is how to restore this for all the Hadoop commands (hadoop, hive, etc.). Just to give you some background: when this issue started, I had a configuration warning in Cloudera Manager referring to an outdated parcel; I fixed that warning and then this issue appeared. Is there any way I can fix this from Cloudera Manager? Your help will be really appreciated.
One interesting thing I noticed is that when I give the complete path, it is able to list the files:
[root@hadoop-vm3 bin]# /opt/cloudera/parcels/CDH-5.3.1-1.cdh5.3.1.p0.5/bin/hadoop fs -ls /
Found 4 items
drwxr-xr-x - hbase hbase 0 2015-02-20 05:41 /hbase
drwxrwxr-x - solr solr 0 2015-02-18 04:50 /solr
drwxrwxrwt - hdfs supergroup 0 2015-02-20 06:44 /tmp
drwxr-xr-x - hdfs supergroup 0 2015-02-20 00:54 /user
==========================================================================
[root@hadoop-vm3 bin]# ls /var/lib/alternatives/
avro-tools hadoop-httpfs-conf hiveserver2 jre_openjdk mapred solr-conf sqoop-create-hive-table sqoop-version
beeline hadoop-kms-conf hive-webhcat-conf kite-dataset mta solrctl sqoop-eval statestored
catalogd hbase hue-conf libnssckbi.so.x86_64 oozie spark-conf sqoop-export whirr
cli_mt hbase-conf impala-conf links oozie-conf spark-executor sqoop-help yarn
cli_st hbase-indexer impalad llama pig spark-shell sqoop-import zookeeper-client
flume-ng hbase-solr-conf impala-shell llamaadmin pig-conf spark-submit sqoop-import-all-tables zookeeper-conf
flume-ng-conf hcat ip6tables.x86_64 llama-conf print sqoop sqoop-job zookeeper-server
hadoop hdfs iptables.x86_64 load_gen pyspark sqoop2 sqoop-list-databases zookeeper-server-cleanup
hadoop-0.20 hive java mahout sentry sqoop2-conf sqoop-list-tables zookeeper-server-initialize
hadoop-conf hive-conf jre_1.6.0 mahout-conf sentry-conf sqoop-codegen sqoop-merge
hadoop-fuse-dfs hive-hcatalog-conf jre_1.7.0 mail senty-conf sqoop-conf sqoop-metastore
==========================================================================
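The full-path listing above shows the parcel binary itself is fine, so only the /usr/bin/hadoop alternatives link is broken. As a rough sketch of recreating that one link by hand with the RHEL alternatives tool (the priority 10 matches the record quoted later in this thread; the cleaner route is to let the Cloudera Manager agent regenerate all of the links, as discussed below):
# Re-register the hadoop alternative manually (sketch only)
alternatives --install /usr/bin/hadoop hadoop \
    /opt/cloudera/parcels/CDH-5.3.1-1.cdh5.3.1.p0.5/bin/hadoop 10
# Confirm the link and priority
alternatives --display hadoop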
Created 02-21-2015 01:06 PM
Created 02-21-2015 06:24 PM
Hi Gautam,
Please find the content below; let me know the next step.
[root@hadoop-vm3 alternatives]# pwd
/var/lib/alternatives
[root@hadoop-vm3 alternatives]# cat hadoop
auto
/usr/bin/hadoop
/opt/cloudera/parcels/CDH-5.3.1-1.cdh5.3.1.p0.5/bin/hadoop
10
[root@hadoop-vm3 alternatives]#
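For reference, that record is the alternatives admin file for hadoop: the mode (auto), the link it maintains (/usr/bin/hadoop), the registered candidate (the parcel binary), and its priority (10). A quick way to verify the whole chain end to end (a sketch using the paths from this thread):
ls -l /usr/bin/hadoop            # should point to /etc/alternatives/hadoop
ls -l /etc/alternatives/hadoop   # should point to the parcel's bin/hadoop
alternatives --display hadoop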
Created 02-21-2015 10:13 PM
Created 02-22-2015 02:09 AM
Yes, the file does exist and it has the content below:
[root@hadoop-vm3 ~]# cat /opt/cloudera/parcels/CDH-5.3.1-1.cdh5.3.1.p0.5/bin/hadoop
#!/bin/bash
# Reference: http://stackoverflow.com/questions/59895/can-a-bash-script-tell-what-directory-its-stored-in
SOURCE="${BASH_SOURCE[0]}"
BIN_DIR="$( dirname "$SOURCE" )"
while [ -h "$SOURCE" ]
do
SOURCE="$(readlink "$SOURCE")"
[[ $SOURCE != /* ]] && SOURCE="$DIR/$SOURCE"
BIN_DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
done
BIN_DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
LIB_DIR=$BIN_DIR/../lib
# Autodetect JAVA_HOME if not defined
. $LIB_DIR/bigtop-utils/bigtop-detect-javahome
export HADOOP_LIBEXEC_DIR=//$LIB_DIR/hadoop/libexec
exec $LIB_DIR/hadoop/bin/hadoop "$@"
[root@hadoop-vm3 ~]#
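That wrapper only autodetects JAVA_HOME and then re-execs the real binary under lib/hadoop/bin inside the parcel, which is why calling the parcel path directly works even while the /usr/bin link is missing. To watch that hand-off, a quick trace is possible (a sketch; any harmless subcommand such as version will do):
# Trace the wrapper script up to the point where it execs the real hadoop binary
bash -x /opt/cloudera/parcels/CDH-5.3.1-1.cdh5.3.1.p0.5/bin/hadoop version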
Created 02-22-2015 02:18 AM
Thanks a lot Gautam, it is working fine now after the restart. I can access the Hadoop file system from the command line on the NameNode machine now. Many thanks for your help!
By the way, should I run this command on the data nodes as well? I logged in to one of the data nodes and it is not able to recognize hadoop there:
[root@hadoopvm1 ~]# hadoop fs -ls /
-bash: hadoop: command not found
service cloudera-scm-agent restart ?
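The alternatives links are maintained per host by the Cloudera Manager agent, so a data node that cannot find hadoop most likely needs the same agent restart there. A minimal sketch (the hostnames are placeholders for your own data nodes):
# Restart the CM agent on each affected host so it regenerates the parcel alternatives
for host in hadoopvm1 hadoopvm2; do
    ssh root@$host 'service cloudera-scm-agent restart'
done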
Created 02-22-2015 02:32 AM
Created 03-11-2015 01:04 AM
I am having the same problem, but when I opened /var/lib/alternatives I found the hadoop file, and most of the other files, empty, with zero size!
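A sketch for that variant, under the assumption that the same agent restart regenerates the truncated records (if the alternatives tool then complains about a corrupt admin file, moving the empty file aside first may be needed):
# Find the alternatives records that have been truncated to zero bytes
find /var/lib/alternatives -maxdepth 1 -type f -size 0
# Restart the agent on this host so it rebuilds the parcel alternatives
service cloudera-scm-agent restart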