
Copying files from a remote server to HDFS

Explorer

Hi,

I have a remote server and a Kerberos-authenticated Hadoop environment.

I want to copy files from the remote server to HDFS for processing with Spark. Please advise an efficient approach/HDFS command to copy files from the remote server to HDFS. Any example would be helpful. We are not allowed to use Flume or NiFi.

Please note that Kerberos is installed on the remote server.

1 ACCEPTED SOLUTION

Contributor

@Vinit Pandey

I suggest that on your HDFS server you start a process (a shell script, for example) that executes kinit and then fetches the remote files using sftp or scp, for example

# scp user@remoteserver:/remotepath/files localpath/ 

and

# hdfs dfs -put localpath/files /hdfspath

Note: To automate this process, you can set up private/public SSH key authentication between these servers and create a crontab entry.
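
A minimal sketch of such a script, assuming a keytab at /etc/security/keytabs/ingest.keytab, a principal ingest@EXAMPLE.COM, and a local staging directory /data/staging (all placeholder names to adapt to your environment), tying the kinit, scp, and hdfs dfs -put steps together:

#!/bin/bash
# pull_to_hdfs.sh - copy files from the remote server into HDFS (sketch).
set -euo pipefail

KEYTAB=/etc/security/keytabs/ingest.keytab   # keytab for the ingest user (placeholder)
PRINCIPAL=ingest@EXAMPLE.COM                 # Kerberos principal (placeholder)
REMOTE=user@remoteserver:/remotepath/files   # source, as in the scp example above
LOCAL=/data/staging                          # local staging directory (placeholder)
HDFS_DIR=/hdfspath                           # target HDFS directory

# Obtain a Kerberos ticket non-interactively so the hdfs command also works from cron.
kinit -kt "$KEYTAB" "$PRINCIPAL"

# Pull the files from the remote server (relies on passwordless SSH keys).
mkdir -p "$LOCAL"
scp "$REMOTE" "$LOCAL"/

# Push the staged files into HDFS, then clean up the staging area.
hdfs dfs -put -f "$LOCAL"/* "$HDFS_DIR"
rm -f "$LOCAL"/*

Once passwordless SSH is in place (ssh-keygen on the HDFS server, then ssh-copy-id user@remoteserver), a crontab entry along these lines would run it hourly:

0 * * * * /path/to/pull_to_hdfs.sh >> /var/log/pull_to_hdfs.log 2>&1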
