Created 06-16-2017 12:33 PM
I would like to be able to run HDFS commands from a computer that is NOT actually part of the Cloudera cluster.
For example, performing simple put/get operations, or:
hdfs dfs -ls /my/dir
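By put/get I mean commands along these lines (the file names here are just placeholders):

hdfs dfs -put ./localfile.txt /my/dir/
hdfs dfs -get /my/dir/remotefile.txt .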
I think I have installed the correct binaries. From CM I found that the cluster is running CDH 4.7.1, so I installed the binaries (sudo apt-get install hadoop-client) from here.
If I run:
hdfs dfs -ls /
I get:
Error: JAVA_HOME is not set and could not be found.
I suspect this might just be the beginning of a long tinkering-and-configuring process, and I unfortunately know nothing about Java. I do, however, know the IPs of the namenodes in my cluster and have full admin rights from end to end.
Can someone help me get things configured?
P.S. In case it wasn't clear: I can perform all of the desired functionality on nodes that are part of the cluster. I just want to do the same from my development environment.
Created 06-28-2017 08:27 PM
Yes, thank you. That did the trick.
So, basically, my procedure was as follows:
# Point apt at the Cloudera repository matching the cluster's CDH version
sudo vim /etc/apt/sources.list.d/cloudera-manager.list
# Install the Hadoop client binaries
sudo apt-get install hadoop-client
# Install a JRE so the hdfs command can find Java
sudo apt-get install openjdk-6-jre-headless
# Pull the cluster's client configuration onto the local machine
sudo scp user@cluster:/etc/hadoop/conf/* /etc/hadoop/conf
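For reference, here is a minimal sketch of what the repository entry and a final sanity check can look like. The repo line and the JAVA_HOME path are assumptions for Ubuntu precise with the CDH 4 archive and OpenJDK 6, so match them to your own release and CDH version:

# Example entry for cloudera-manager.list (assumed: Ubuntu precise, CDH 4 archive)
deb [arch=amd64] http://archive.cloudera.com/cdh4/ubuntu/precise/amd64/cdh precise-cdh4 contrib

# If "JAVA_HOME is not set" persists, export it explicitly
# (assumed path for openjdk-6 on Ubuntu amd64)
export JAVA_HOME=/usr/lib/jvm/java-6-openjdk-amd64

# Sanity check: list the HDFS root from the new client machine
hdfs dfs -ls /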
Created 02-05-2018 10:07 PM
Hi, referring to the last step: did you encounter a Permission denied error when doing the scp?
sudo scp user@cluster:/etc/hadoop/conf/* /etc/hadoop/conf
I managed to copy all of the files inside /conf except for container-executor.cfg, which produced the message below in the terminal:
scp: /etc/hadoop/conf/container-executor.cfg: Permission denied
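That file is root-owned with restrictive permissions on the cluster side (it is read by YARN's LinuxContainerExecutor on NodeManagers), which is why a plain scp fails on it; a client-only machine can most likely do without it. If you do want a copy, one workaround, assuming you have passwordless sudo on the cluster node, is a sketch like this:

# Read the root-owned file with sudo on the remote end, write it locally
ssh user@cluster 'sudo cat /etc/hadoop/conf/container-executor.cfg' | sudo tee /etc/hadoop/conf/container-executor.cfg > /dev/null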
Created 06-28-2017 04:58 AM
@mbigelow " It is part of the magic that happens when you install a gateway role "
Are you referring to the NFS gateway role ?
Created 06-28-2017 09:45 PM
@mbigelow I got confused. Thanks for clarifying it.