Created on 02-20-2015 10:31 AM - edited 09-16-2022 02:22 AM
Hi Team, I'm facing a very strange issue with the latest Cloudera installation. I can view the HDFS directory from the web interface, but when I run a simple hadoop fs -ls in a PuTTY shell, it says "-bash: hadoop: command not found". Can you please help?
[root@hadoop-vm3 log]# hadoop fs -ls /
-bash: hadoop: command not found
Version: Cloudera Express 5.3.1 (#191 built by jenkins on 20150123-2020 git: b0377087cf605a686591e659eb14078923bc3c83)
Server Time: Feb 20, 2015 1:29:11 PM, Eastern Standard Time (EST)
The listing shown in the web interface:

Permission | Owner | Group | Size | Replication | Block Size | Name
drwxrwxrwt | admin | hive | 0 B | 0 | 0 B | .hive-staging_hive_2015-02-20_04-40-09_720_8287848305105515146-1
drwxrwxrwt | root | hive | 0 B | 0 | 0 B | ttime=2015-02-20
drwxrwxrwt | root | hive | 0 B | 0 | 0 B |
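A quick sketch of checks that narrow this kind of failure down (my own suggestions, not from the thread; the parcel path matches the CDH 5.3.1 version reported above and should be adjusted to your install):

```shell
# Can the shell find hadoop at all, and what is on PATH?
command -v hadoop || echo "hadoop is not on PATH"
echo "$PATH"

# On a parcel install the real client scripts live under the parcel directory:
ls /opt/cloudera/parcels/CDH-5.3.1-1.cdh5.3.1.p0.5/bin 2>/dev/null \
  || echo "parcel bin directory not found on this host"

# /usr/bin/hadoop is normally an alternatives-managed symlink chain:
ls -l /usr/bin/hadoop /etc/alternatives/hadoop 2>/dev/null \
  || echo "alternatives symlinks for hadoop are missing"
```

If the parcel bin directory exists but /usr/bin/hadoop does not, the problem is the symlinks rather than the installation itself.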
Created 02-22-2015 02:18 AM
Thanks a lot, Gautam, it's working fine now after the restart. I can now access the Hadoop file system from the command line on the NameNode machine. Many, many thanks for your help!
By the way, should I run this command on the data nodes as well? I logged in to one of the data nodes and it doesn't recognize hadoop there either:
[root@hadoopvm1 ~]# hadoop fs -ls /
-bash: hadoop: command not found
Should I run "service cloudera-scm-agent restart" there as well?
Created 03-11-2015 02:35 AM
Hi Tarek,
You can try running this; it should fix the issue:
service cloudera-scm-agent restart
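As a sketch, the same restart can be repeated on every node that shows the error, since the client configuration is per-host (hostnames below are placeholders; the `service` form matches the SysV-init hosts in this thread):

```shell
# Restart the Cloudera Manager agent locally, if the SysV service tool exists.
# On systemd hosts the equivalent is: systemctl restart cloudera-scm-agent
command -v service >/dev/null 2>&1 \
  && service cloudera-scm-agent restart \
  || echo "service tool not found on this host"

# Repeat on the other affected nodes (placeholder hostnames; uncomment the
# ssh line to actually run it, assuming root SSH access between nodes):
for host in hadoopvm1 hadoopvm2; do
    echo "would restart agent on $host"
    # ssh root@"$host" service cloudera-scm-agent restart
done
```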
Created 03-11-2015 03:19 AM
I tried it but it didn't work. I got it fixed by copying all the files under /var/lib/alternatives from a working VM to the VM that had the problem, then restarting the agent. Thanks so much!
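The fix described here can be sketched as follows ("working-vm" is a placeholder for the known-good host, and root SSH access between the nodes is assumed):

```shell
# Clone the alternatives database from a healthy node, then restart the agent
# so the symlinks get re-evaluated. BatchMode avoids an interactive prompt
# when the placeholder host is unreachable.
scp -o BatchMode=yes -r "root@working-vm:/var/lib/alternatives/." /var/lib/alternatives/ \
  || echo "copy failed; check connectivity to the healthy node"

service cloudera-scm-agent restart 2>/dev/null \
  || echo "agent restart failed or service tool unavailable"
```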
Created 03-19-2015 09:14 AM
Hi
I have CDH 5.3.1 installed on my machine and I am facing the same issue. I have followed all the steps above: deleting the empty link, setting up environment variables, etc. I think the problem is an incorrect path in my environment variables.
When I run /opt/cloudera/parcels/CDH-5.3.1-1.cdh5.3.1.p0.5/bin/hadoop status, the output is:
/opt/cloudera/parcels/CDH-5.3.1-1.cdh5.3.1.p0.5/bin/../lib/hadoop/bin/hadoop: line 138: /usr/lib/jvm/jdk1.8.0_40: Is a directory
/opt/cloudera/parcels/CDH-5.3.1-1.cdh5.3.1.p0.5/bin/../lib/hadoop/bin/hadoop: line 138: exec: /usr/lib/jvm/jdk1.8.0_40: cannot execute: Is a directory
It might be a silly mistake in my path settings. :(
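The "cannot execute: Is a directory" message means the wrapper script ended up exec'ing the JDK directory itself rather than the java binary inside it. My reading (an assumption, not confirmed in the thread) is that JAVA_HOME or a JAVA variable was set to the bare directory that gets exec'd. A hedged sketch of the check:

```shell
# JAVA_HOME should be the JDK root (path taken from the error above); the
# hadoop wrapper then runs $JAVA_HOME/bin/java, so that file must exist and
# be executable.
export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_40
unset JAVA    # let the wrapper derive the binary path itself

if [ -x "$JAVA_HOME/bin/java" ]; then
    "$JAVA_HOME/bin/java" -version
else
    echo "no executable at $JAVA_HOME/bin/java; point JAVA_HOME at a real JDK"
fi
```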
Created 03-20-2015 02:51 AM
Hi,
I was able to resolve part of the error. I can execute all the hadoop, hive, impala and hbase commands from /opt/cloudera/parcels/CDH-5.3.1-1.cdh5.3.1.p0.5/bin. How am I supposed to link these directories in /etc/alternatives to create the symlinks? What is the use of these symlinks? And how can I run the commands directly from /home/cloudera?
In the CM UI everything looks good, but when I start Flume I get the error "Failed to start agent because dependencies were not found in classpath". I think this error is related to the classpath configuration. Are the symlinks involved in the classpath?
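The /etc/alternatives entries are what put the parcel commands on the default PATH: /usr/bin/hadoop points to /etc/alternatives/hadoop, which points to the parcel script. That is why the commands work from the parcel bin directory but not from /home/cloudera. A sketch of recreating them with RHEL's alternatives tool (priority 10 is arbitrary; run as root; the parcel path is the one from this thread):

```shell
# Parcel root from this thread; adjust the version to yours.
P=/opt/cloudera/parcels/CDH-5.3.1-1.cdh5.3.1.p0.5

for cmd in hadoop hdfs hive hbase impala-shell; do
    if command -v alternatives >/dev/null 2>&1; then
        # Registers /usr/bin/$cmd -> /etc/alternatives/$cmd -> parcel script
        alternatives --install "/usr/bin/$cmd" "$cmd" "$P/bin/$cmd" 10
    else
        echo "alternatives tool not found; would link /usr/bin/$cmd -> $P/bin/$cmd"
    fi
done
```

Note that restarting the cloudera-scm-agent (as suggested earlier in this thread) normally recreates these links for you; the manual form above is only a fallback.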
Created on 01-26-2017 07:39 PM - edited 01-26-2017 08:32 PM
I was having the same issue and getting the same error, but when I ran the commands from the directory where I had installed CDH, I was able to run all of them: hadoop, hdfs, spark-shell, etc.
E.g. if your CDH installation location is /dat/anlt1/cld/cloudera/CDH-5.8.3-1.cdh5.8.3.p0.2/bin, you can test:
$ cd /dat/anlt1/cld/cloudera/CDH-5.8.3-1.cdh5.8.3.p0.2/bin
[root@xyz bin]# ./hadoop
If that works, you need to set up the environment variable on your Unix master server.
For RHEL:
[root@xyz ~]# echo "$PATH"
/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin
[root@xyz~]# export PATH=$PATH:/path/to/CHD_Installation_bin_path
For me it's /dat/anlt1/cld/cloudera/CDH-5.8.3-1.cdh5.8.3.p0.2/bin.
[root@xyz~]# echo "$PATH"
/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/dat/anlt1/cld/cloudera/CDH-5.8.3-1.cdh5.8.3.p0.2/bin
To make the change permanent:
$ echo "export PATH=$PATH:/dat/anlt1/cld/cloudera/CDH-5.8.3-1.cdh5.8.3.p0.2/bin" >> /etc/profile
After that, restart (reboot) your server, or simply log out and back in so that /etc/profile is re-read.
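The PATH change above can be demonstrated end to end with a throwaway directory standing in for the parcel bin directory (everything below is a self-contained sketch, not the real installation):

```shell
# Create a fake "parcel bin" containing a stub hadoop script
bindir=$(mktemp -d)
printf '#!/bin/sh\necho stub-hadoop\n' > "$bindir/hadoop"
chmod +x "$bindir/hadoop"

# Append it to PATH, exactly like `export PATH=$PATH:...` above
PATH="$PATH:$bindir"
export PATH

# The shell can now resolve the command through PATH
out=$(command -v hadoop)
echo "resolved to: $out"
```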