Support Questions

Configure hadoop-client tools to access hdfs from external computer

Expert Contributor

I would like to be able to run HDFS commands from a computer that is NOT actually part of the Cloudera cluster.

 

For example, performing simple put/get operations or

hdfs dfs -ls /my/dir

I have installed the correct binaries, I think. I found from CM that the cluster is running CDH 4.7.1, so I installed the binaries (sudo apt-get install hadoop-client) from here.
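 

For reference, a quick way to check that the installed client build matches the cluster (a minimal sketch, assuming a Debian/Ubuntu host with the standard apt tooling):

# shows the installed hadoop-client package version and which repository it came from
apt-cache policy hadoop-client
# once Java is available, prints the Hadoop/CDH build of the client binaries
hadoop version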

 

If I run:

hdfs dfs -ls /

I get:

Error: JAVA_HOME is not set and could not be found.

 

I feel that this might just be the beginning of a long tinkering and configuring process, and I unfortunately know nothing about Java. I do, however, know the IPs of the NameNodes on my cluster and have full admin rights from beginning to end.

 

Can someone help me get things configured?

 

P.S. In case it wasn't clear: I can perform all of the desired functionality on nodes that are part of the cluster. I just want to do something similar from my development environment.

1 ACCEPTED SOLUTION

Champion
For what it is worth, I just did this and it worked; a sketch of the concrete commands follows the list.

1. set up the cdh 5 repo
2. installed hadoop-client with my package manager
3. updated the configs manually (scp or cm api)
4. ???
5. profit
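 

A minimal sketch of steps 1-3 on a Debian/Ubuntu host (the repo line follows the old CDH 5 layout for Ubuntu trusty, and user@cluster-node is a placeholder; adjust for your distro, CDH version, and cluster):

# 1. add the Cloudera repository and refresh the package index
echo "deb [arch=amd64] http://archive.cloudera.com/cdh5/ubuntu/trusty/amd64/cdh trusty-cdh5 contrib" | sudo tee /etc/apt/sources.list.d/cloudera.list
sudo apt-get update
# 2. install the client binaries
sudo apt-get install hadoop-client
# 3. copy the client configuration from a working cluster node
sudo scp user@cluster-node:/etc/hadoop/conf/* /etc/hadoop/conf/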


14 REPLIES

Expert Contributor

Yes, thank you.  That did the trick.

 

So, basically, my procedure was as follows.

 

  • Add the Cloudera repository containing the Hadoop binaries
sudo vim /etc/apt/sources.list.d/cloudera-manager.list
  • Install the binaries. I used the hadoop-client package and that was enough.
sudo apt-get install hadoop-client
  • Install Java. This made the $JAVA_HOME error go away even though I still don't have that variable set.
sudo apt-get install openjdk-6-jre-headless
  • Copy the config files from a functioning cluster node (a quick verification check is sketched after this list)
sudo scp user@cluster:/etc/hadoop/conf/* /etc/hadoop/conf
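 

A quick verification that the client is actually talking to the cluster (a sketch; the paths below are typical defaults for a Debian/Ubuntu openjdk-6 install and are assumptions):

# point the client at the copied configs (this is usually already the default location)
export HADOOP_CONF_DIR=/etc/hadoop/conf
# only needed if the JAVA_HOME error reappears; the exact path depends on the JDK package
export JAVA_HOME=/usr/lib/jvm/java-6-openjdk-amd64
# should now list the cluster's HDFS root rather than the local filesystem
hdfs dfs -ls /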

 

New Contributor

Hi, referring to the last step, did you encounter a Permission denied error when doing the scp?

sudo scp user@cluster:/etc/hadoop/conf/* /etc/hadoop/conf 

 

I managed to copy all the files inside /conf except for container-executor.cfg, which produces the message below in the terminal:

scp: /etc/hadoop/conf/container-executor.cfg: Permission denied
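 

For context, that file is normally owned by root and not world-readable on cluster nodes, which is why a non-root scp cannot read it; it configures the YARN container-executor on NodeManagers and an HDFS client does not need it. A quick check (user@cluster is a placeholder):

# shows the restrictive ownership/permissions on the cluster side
ssh user@cluster ls -l /etc/hadoop/conf/container-executor.cfg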

Champion

@mbigelow "It is part of the magic that happens when you install a gateway role"

 

Are you referring to the NFS gateway role?

Champion
@csguna No, the YARN gateway, HDFS gateway, Hive gateway, etc. Each of these installs the binaries and libraries, sets env vars, and deploys the client configuration files for its service.
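 

Related to the "scp or cm api" note in the accepted solution: the same client configuration a gateway role would deploy can also be downloaded from Cloudera Manager over its REST API. A sketch, with the CM host, credentials, API version, and cluster/service names all placeholders:

# download the HDFS client configuration bundle from Cloudera Manager
curl -u admin:admin -o hdfs-clientconfig.zip "http://cm-host:7180/api/v10/clusters/Cluster1/services/hdfs/clientConfig"
# unpack it and place the files in /etc/hadoop/conf on the external machine
unzip hdfs-clientconfig.zip -d hdfs-clientconfig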

Champion

@mbigelow I got confused. Thanks for clarifying it.