In Hortonworks Sandbox 2.5, how do I upload a zip file to HDFS?
Labels: Apache Hadoop
Created 10-23-2016 07:51 AM
Created 10-23-2016 07:55 AM
You have multiple options:
- Copy the zip file to a local filesystem path on the gateway node, then use `hadoop fs -put` or `hadoop fs -copyFromLocal` to copy it from the local filesystem to HDFS.
- Configure the HDFS NFS Gateway and mount HDFS to a local filesystem path, where you can copy files directly with `scp` or `cp`.
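As a minimal sketch of the first option (the zip path and the HDFS target directory below are hypothetical; substitute your own):

```shell
# Copy a local zip file into HDFS from the gateway node's shell.
# /tmp/data.zip and /user/maria_dev/uploads are example paths only.
hadoop fs -mkdir -p /user/maria_dev/uploads     # create the target directory if needed
hadoop fs -put /tmp/data.zip /user/maria_dev/uploads/
hadoop fs -ls /user/maria_dev/uploads           # verify the file landed in HDFS
```

`hadoop fs -copyFromLocal` is equivalent to `-put` for this case; both require a working Hadoop client configuration on the node where you run them.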
Created 10-24-2016 02:55 AM
Hi,
Thanks for your response.
Where do I run this command? We don't have a command-line interface.
We are not using NFS.
My file is on the local machine.
Regards,
Jayachandra Babu
Created 10-24-2016 10:24 AM
The command should be run on the CLI. If you don't have CLI access, you could set up something like an FTP server over HDFS:
https://sites.google.com/a/iponweb.net/hadoop/Home/hdfs-over-ftp
If you want to continuously fetch data from a remote system and load it into Hadoop, you can also look at Flume or Kafka.
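Once an hdfs-over-ftp server is running, a standard FTP client can upload files into HDFS. A sketch with curl (the host, port, credentials, and paths below are placeholders, not values from this thread):

```shell
# Hypothetical: upload a local zip through an hdfs-over-ftp server.
# Replace host, port, user, password, and paths with your own setup.
curl -T /tmp/data.zip --user hdfsuser:hdfspass \
  ftp://sandbox.hortonworks.com:2222/user/maria_dev/data.zip
```

The `-T` option tells curl to upload the given local file to the remote URL; the server then writes it into the corresponding HDFS path.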