Configure hadoop-client tools to access HDFS from an external computer
Labels: HDFS
Created 06-16-2017 12:33 PM
I would like to be able to perform HDFS commands from a computer that is NOT actually part of the Cloudera cluster.
For example, performing simple put/get operations or
hdfs dfs -ls /my/dir
I have installed the correct binaries, I think. I found from CM that I was using CDH 4.7.1, so I installed the binaries (sudo apt-get install hadoop-client) from here.
If I run:
hdfs dfs -ls /
I get:
Error: JAVA_HOME is not set and could not be found.
I suspect this is just the beginning of a long tinkering-and-configuring process, and I unfortunately know nothing about Java. I do, however, know the IPs of the NameNodes on my cluster and have full admin rights from beginning to end.
Can someone help me get things configured?
P.S. In case it wasn't clear: I can perform all the desired functionality on nodes that are part of the cluster. I just want to do something similar from my development environment.
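To make the goal concrete, this is roughly the kind of thing I'm after (hostname and port below are placeholders; 8020 is the usual NameNode RPC port, and I'm assuming the client can be pointed at a NameNode explicitly with the generic -fs option until proper config files are in place):
hdfs dfs -fs hdfs://namenode.example.com:8020 -ls /my/dir
hdfs dfs -fs hdfs://namenode.example.com:8020 -put localfile.txt /my/dir/
hdfs dfs -fs hdfs://namenode.example.com:8020 -get /my/dir/remotefile.txt .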
Created 06-26-2017 02:22 AM
1. Set up the CDH 5 repo
2. Install hadoop-client with your package manager
3. Update the configs manually (scp or the CM API); rough commands are sketched below
4. ???
5. Profit
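In rough command form (the repo line, distro codename, list file name, and hostname below are placeholders; take the actual repo entry from Cloudera's CDH 5 documentation):
# 1. Set up the CDH 5 apt repo (Ubuntu/Debian example)
echo "deb [arch=amd64] <cdh5-repo-url> <codename>-cdh5 contrib" | sudo tee /etc/apt/sources.list.d/cloudera.list
sudo apt-get update
# 2. Install the client packages
sudo apt-get install hadoop-client
# 3. Pull the client configs from an existing cluster/gateway node
sudo scp user@cluster-node:/etc/hadoop/conf/* /etc/hadoop/conf/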
Created 06-28-2017 08:27 PM
Yes, thank you. That did the trick.
So, basically, my procedure was as follows.
- Add the Cloudera repository containing the Hadoop binaries
sudo vim /etc/apt/sources.list.d/cloudera-manager.list
- Install the binaries. The hadoop-client package was enough.
sudo apt-get install hadoop-client
- Install Java. This made the $JAVA_HOME error go away, even though I still don't have that environment variable set.
sudo apt-get install openjdk-6-jre-headless
- Copy the config files from a functioning cluster node
sudo scp user@cluster:/etc/hadoop/conf/* /etc/hadoop/conf
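After that, the basic operations from my original post work from the external machine, e.g. (paths are just examples):
hdfs dfs -ls /
hdfs dfs -put localfile.txt /my/dir/
hdfs dfs -get /my/dir/localfile.txt .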
Created 02-05-2018 10:07 PM
Hi, referring to the last step: did you encounter a Permission denied error when doing the scp?
sudo scp user@cluster:/etc/hadoop/conf/* /etc/hadoop/conf
I managed to copy all the files inside /conf except for container-executor.cfg, which produces the message below in the terminal:
scp: /etc/hadoop/conf/container-executor.cfg: Permission denied
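For what it's worth, my understanding is that an HDFS client doesn't need container-executor.cfg (it's used by the YARN NodeManagers), so I'm just ignoring that one file for now. If the whole directory is really needed, something like the following sketch should work, assuming passwordless sudo on the cluster node (hostname is a placeholder):
ssh user@cluster-node 'sudo tar cf - -C /etc/hadoop/conf .' | sudo tar xf - -C /etc/hadoop/conf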
Created 06-28-2017 04:58 AM
@mbigelow: "It is part of the magic that happens when you install a gateway role"
Are you referring to the NFS Gateway role?
Created 06-28-2017 07:53 PM
Created 06-28-2017 09:45 PM
@mbigelow I got confused. Thanks for clarifying it.
