Created 10-23-2016 07:55 AM
You have a couple of options to do this:
- Copy the zip file to a local filesystem path on a gateway node, then use "hadoop fs -put" or "hadoop fs -copyFromLocal" to copy it from the local FS into HDFS (see the example after this list).
- Configure the NFS Gateway for HDFS and mount HDFS at a local FS path; you can then copy the files directly with scp or a plain copy command.
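A minimal sketch of both approaches. The paths, the target HDFS directory /user/jayachandra, and the NFS gateway hostname are placeholders, not values from this thread:

    # Option 1: copy from the gateway node's local FS into HDFS
    hadoop fs -mkdir -p /user/jayachandra
    hadoop fs -put /tmp/data.zip /user/jayachandra/
    hadoop fs -ls /user/jayachandra    # verify the upload

    # Option 2: mount HDFS through the NFS Gateway (hostname is a placeholder),
    # then use ordinary copy tools against the mount point
    mkdir -p /hdfs_mount
    mount -t nfs -o vers=3,proto=tcp,nolock,noacl,sync nfsgateway-host:/ /hdfs_mount
    cp /tmp/data.zip /hdfs_mount/user/jayachandra/

Option 2 requires the NFS Gateway service to be running on the cluster; the mount options above follow the standard HDFS NFS Gateway documentation.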
Created 10-24-2016 02:55 AM
Hi,
Thanks for your response.
Where do I write this command? We don't have a command-line interface.
We are not using NFS.
My file is on my local machine.
Regards,
Jayachandra Babu
Created 10-24-2016 10:24 AM
The command should be used on the CLI. If you are not using the CLI, then you might go for something like an FTP server over HDFS:
https://sites.google.com/a/iponweb.net/hadoop/Home/hdfs-over-ftp
If you want to continuously fetch data from a remote system and dump it into Hadoop, you can also look at Flume or Kafka.
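For the Flume route, here is a minimal sketch of an agent config that watches a local directory and ships any file dropped there into HDFS. The agent name a1, the directories, and the NameNode address are assumptions for illustration, not values from this thread:

    # agent.conf -- hypothetical agent "a1": spooldir source -> file channel -> HDFS sink
    a1.sources  = r1
    a1.channels = c1
    a1.sinks    = k1

    # watch a local directory; Flume ingests each file placed here
    a1.sources.r1.type     = spooldir
    a1.sources.r1.spoolDir = /var/data/incoming
    a1.sources.r1.channels = c1

    # buffer events durably on local disk between source and sink
    a1.channels.c1.type = file

    # write into HDFS (NameNode host/port and target path are placeholders)
    a1.sinks.k1.type          = hdfs
    a1.sinks.k1.hdfs.path     = hdfs://namenode:8020/user/jayachandra/incoming
    a1.sinks.k1.hdfs.fileType = DataStream
    a1.sinks.k1.channel       = c1

You would then start the agent with:

    flume-ng agent --conf conf --conf-file agent.conf --name a1

This still assumes shell access to some node that can reach the cluster; if you truly have no CLI at all, the FTP-over-HDFS approach above is the closer fit.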