Champion
Posts: 744
Registered: ‎05-16-2016

Re: Configure hadoop-client tools to access hdfs from external computer

@mbigelow  "It is part of the magic that happens when you install a gateway role"

Are you referring to the NFS gateway role?

Posts: 642
Topics: 3
Kudos: 113
Solutions: 67
Registered: ‎08-16-2016

Re: Configure hadoop-client tools to access hdfs from external computer

@csguna No, the YARN gateway, HDFS gateway, Hive gateway, etc. Each of these installs the binaries and libraries, sets environment variables, and deploys the client configuration files for its service.
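Once a gateway role's client configuration has been deployed to a host, it can be sanity-checked from the shell. A minimal sketch, assuming the usual CDH defaults (the alternatives entry is named hadoop-conf on CDH hosts):

```shell
# Show which client configuration directory the alternatives system
# points at (Cloudera Manager deploys gateway configs via alternatives).
update-alternatives --display hadoop-conf

# Confirm the client picks up the cluster's NameNode from that config.
hdfs getconf -confKey fs.defaultFS
```

If fs.defaultFS resolves to the cluster's NameNode rather than the local host, the gateway's client configuration is in effect.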
Explorer
Posts: 39
Registered: ‎04-13-2017

Re: Configure hadoop-client tools to access hdfs from external computer

Yes, thank you.  That did the trick.

 

So, basically, my procedure was as follows.

 

  • Add the Cloudera repository containing the Hadoop binaries
sudo vim /etc/apt/sources.list.d/cloudera-manager.list
  • Install the binaries. The hadoop-client package was enough.
sudo apt-get install hadoop-client
  • Install Java. This made the error about $JAVA_HOME go away, even though I still don't have that environment variable set.
sudo apt-get install openjdk-6-jre-headless
  • Copy the config files from a functioning cluster node
sudo scp user@cluster:/etc/hadoop/conf/* /etc/hadoop/conf
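For the first step, the repository file holds a single deb line. The exact URL and codename depend on your Ubuntu release and CM/CDH version, so the entry below is only an illustrative shape (check Cloudera's repository documentation for the real values):

```
# /etc/apt/sources.list.d/cloudera-manager.list
# Illustrative entry; substitute your Ubuntu codename and CM version.
deb [arch=amd64] https://archive.cloudera.com/cm5/ubuntu/xenial/amd64/cm xenial-cm5 contrib
```

After editing the file, run sudo apt-get update so the hadoop-client package becomes visible to apt.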

 

Champion
Posts: 744
Registered: ‎05-16-2016

Re: Configure hadoop-client tools to access hdfs from external computer

@mbigelow  I got confused. Thanks for clarifying it.

New Contributor
Posts: 2
Registered: ‎02-05-2018

Re: Configure hadoop-client tools to access hdfs from external computer

Hi, referring to the last step, did you encounter a Permission denied error when doing the scp?

sudo scp user@cluster:/etc/hadoop/conf/* /etc/hadoop/conf 

 

I managed to copy all the files inside /conf except for container-executor.cfg, which produced the message below in the terminal:

scp: /etc/hadoop/conf/container-executor.cfg: Permission denied
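That error is expected: container-executor.cfg is typically owned by root with very restrictive permissions, and scp reads the remote file with the remote user's privileges, so a local sudo does not help. One workaround, assuming your remote user has sudo rights on the cluster node, is to stream the directory through a root-run tar on the remote side (a sketch, not the only approach):

```shell
# Read the config directory as root on the remote host, write it as
# root locally; avoids per-file permission errors during scp.
ssh user@cluster 'sudo tar -C /etc/hadoop/conf -cf - .' \
  | sudo tar -C /etc/hadoop/conf -xf -
```

That said, container-executor.cfg configures YARN's container executor on NodeManager hosts, so a client-only machine generally does not need it; skipping that one file is usually fine.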