How to import data
Labels: Apache Hadoop
Created on 07-02-2018 02:24 PM - edited 08-18-2019 03:02 AM
Hi guys, how are you? I hope you're well. I have a question; see if you can help me. Everything I create through the web interface only appears in the web interface. If I create a folder in /tmp via the terminal, for example, it is not available in the Ambari interface, and vice-versa, even after setting groups and permissions. I'm using IBM's Hadoop service.
Created 07-02-2018 02:29 PM
I'm not sure how you are doing the "ls" in the terminal. The listing in the Ambari interface and the output of "hdfs dfs -ls /tmp/" in the terminal should match; a plain "ls /tmp" shows the node's local Linux filesystem, which is separate from HDFS.
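For illustration, here is the difference between the two namespaces; the directory name below is a made-up example:

```bash
# Lists /tmp on the local Linux filesystem of the node you are logged into
ls /tmp

# Lists /tmp in HDFS -- this is what the Ambari interface shows
hdfs dfs -ls /tmp/

# A directory created with plain mkdir exists only locally; to create one
# in HDFS (so it appears in Ambari) use hdfs dfs -mkdir:
hdfs dfs -mkdir /tmp/mydata    # "mydata" is a placeholder name
```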
Created 07-02-2018 02:47 PM
Hey, it worked out right, thanks for the tip! Now another point: how do I access this directory? I tried the "hdfs dfs -cd /tmp/" command, but it did not work! I wanted to get into that directory, download the data there with "wget", and then unzip it with "tar -xzvf".
Created 07-02-2018 04:07 PM
This is not how HDFS works 🙂 HDFS has no notion of a current working directory, so there is no "cd"; to browse it like a local filesystem you will have to configure the NFS Gateway (https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.4/bk_hdfs-nfs-gateway-user-guide/content/hdfs-nfs-gateway-user-guide.html)
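A minimal sketch of mounting HDFS through the NFS Gateway, assuming the gateway service is already configured and running; the host name and mount point are placeholders:

```bash
# Assumes the HDFS NFS Gateway is running on gateway-host;
# "gateway-host" and "/hdfs_mount" are placeholder names.
sudo mkdir -p /hdfs_mount
sudo mount -t nfs -o vers=3,proto=tcp,nolock gateway-host:/ /hdfs_mount

# Once mounted, ordinary shell commands work against HDFS paths:
cd /hdfs_mount/tmp
ls
```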
Alternatively, you can "wget" the files to an HDFS client machine and then copy them into HDFS using "hdfs dfs -put <local-source> <hdfs-destination>"
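A minimal sketch of that wget-then-put workflow; the URL and file names are hypothetical:

```bash
# Download the archive to the local filesystem of an HDFS client machine
# (the URL and archive name are placeholders)
wget https://example.com/dataset.tar.gz

# Extract locally first -- tar cannot read from or write into HDFS directly
tar -xzvf dataset.tar.gz

# Copy the extracted directory into HDFS so it shows up in Ambari
hdfs dfs -put dataset /tmp/dataset

# Verify the copy
hdfs dfs -ls /tmp/dataset
```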
Created 07-02-2018 07:20 PM
Hi, how are you?
I've realized that things don't work the way I imagined, lol! I found the documentation after your first tip, thanks for that.
My problem now is mapping my IBM Cloud Storage to my Hadoop cluster; it's confusing me, but I'm making progress!
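One common approach for S3-compatible object stores is Hadoop's s3a connector; whether it applies to this particular IBM service is an assumption, and the endpoint, bucket, and keys below are placeholders. The properties normally live in core-site.xml, but they can be passed on the command line for a quick test:

```bash
# Quick connectivity test via the s3a connector; all values are placeholders
# and would normally be set in core-site.xml rather than on the command line.
hadoop fs \
  -D fs.s3a.endpoint=s3.example-region.objectstorage.example.com \
  -D fs.s3a.access.key=MY_ACCESS_KEY \
  -D fs.s3a.secret.key=MY_SECRET_KEY \
  -ls s3a://my-bucket/
```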
