Support Questions

Find answers, ask questions, and share your expertise

how to import data

Explorer

Hi guys, how are you? I have a question; please see if you can help me. Everything I create through the web interface only appears in the web interface. For example, if I create a folder in /tmp via the terminal, it is not available in the Ambari interface, and vice versa, even after setting groups and permissions. I'm using IBM's Hadoop service.

(screenshot attached: 78618-selection-043.png)

1 ACCEPTED SOLUTION

@Hugo Cosme

I'm not sure how you are doing the "ls" in the terminal. The listing in the Ambari interface and the output of "hdfs dfs -ls /tmp/" in the terminal should match.
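To make the distinction concrete, here is a minimal sketch (paths are illustrative, and the `hdfs` command assumes a Hadoop client is installed on the machine): a plain `ls` reads the local Linux filesystem of that one node, while the Ambari Files view shows HDFS, which is a separate namespace.

```shell
# Plain `ls` lists the *local* Linux filesystem of this node only;
# a folder created here will NOT appear in the Ambari Files view.
ls /tmp

# This listing reads *HDFS* and is the one that should match Ambari
# (guarded so the example still runs on a machine without a Hadoop client):
if command -v hdfs >/dev/null 2>&1; then
    hdfs dfs -ls /tmp/
fi
```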


4 REPLIES


Explorer

Hey, it worked, thanks for the tip! Now another question: how do I access this directory? I tried the "hdfs dfs -cd /tmp/" command, but it did not work. I wanted to get into that directory, download the data there with "wget", and then unzip it with "tar -xzvf".


That is not how HDFS works 🙂 There is no "cd" in HDFS; you would have to configure the HDFS NFS gateway to be able to cd and use other local filesystem commands (https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.4/bk_hdfs-nfs-gateway-user-guide/content/hdfs-nfs-gateway-user-guide.html)

Alternatively, you can "wget" the files onto an HDFS client machine and then copy them into HDFS using "hdfs dfs -put <local-source> <hdfs-destination>".
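The download-then-put workflow can be sketched as follows. This is a hedged example with made-up file names; the sample archive created at the top merely stands in for whatever "wget" would have downloaded, since "wget" and "tar" both operate on the local filesystem, never directly on HDFS paths. The "hdfs dfs -put" step is guarded so the sketch runs even without a Hadoop client.

```shell
set -e

# Stand-in for `wget https://.../data.tar.gz -O /tmp/demo.tar.gz`:
# create a small sample archive on the local filesystem.
mkdir -p /tmp/demo_src /tmp/demo_out
echo "hello" > /tmp/demo_src/sample.txt
tar -czf /tmp/demo.tar.gz -C /tmp/demo_src sample.txt

# Unpack on the *local* filesystem (tar cannot read or write HDFS paths):
tar -xzf /tmp/demo.tar.gz -C /tmp/demo_out

# Finally, copy the unpacked files into HDFS, if a client is available:
if command -v hdfs >/dev/null 2>&1; then
    hdfs dfs -mkdir -p /tmp/demo_out
    hdfs dfs -put -f /tmp/demo_out/sample.txt /tmp/demo_out/
fi
```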

Explorer

Hi, how are you?
I realized that things do not work the way I imagined, lol! I found the documentation after your first tip, thanks for that.
My problem now is mapping my IBM CloudStorage to my Hadoop. It is confusing me, but I'm making progress!