
Copying files from Remote server to HDFS


Hi,

I have a remote server and a Kerberos-authenticated Hadoop environment.

I want to copy files from the remote server to HDFS for processing with Spark. Please advise an efficient approach or HDFS command for copying files from the remote server to HDFS; any example would be helpful. We are not allowed to use Flume or NiFi.

Please note that Kerberos is installed on the remote server.

1 ACCEPTED SOLUTION


Re: Copying files from Remote server to HDFS


@Vinit Pandey

I suggest that on your HDFS server you start a process (a shell script, for example) that runs kinit and then fetches the remote files with sftp or scp, for example

# scp user@remoteserver:/remotepath/files localpath/ 

and

# hdfs dfs -put localpath/files /hdfspath

Note: To automate this process, you can set up public/private SSH key authentication between the two servers and add a crontab entry.
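
A minimal sketch of such a script, assuming a headless keytab is available for the HDFS user (the keytab path, principal, and directories below are placeholders, not taken from the original post):

#!/bin/bash
# ingest.sh - pull files from the remote server and load them into HDFS.
# Keytab, principal, and paths are illustrative; adjust to your environment.
KEYTAB=/etc/security/keytabs/myuser.keytab
PRINCIPAL=myuser@EXAMPLE.COM
REMOTE=user@remoteserver:/remotepath/files
STAGING=/tmp/staging
HDFS_DIR=/hdfspath

# Obtain a Kerberos ticket non-interactively from the keytab
kinit -kt "$KEYTAB" "$PRINCIPAL"

# Copy the files from the remote server into a local staging directory
mkdir -p "$STAGING"
scp "$REMOTE" "$STAGING"/

# Load the staged files into HDFS, then clean up the staging directory
hdfs dfs -put -f "$STAGING"/* "$HDFS_DIR"
rm -f "$STAGING"/*

With passwordless SSH keys in place between the two servers, a crontab entry along these lines would run the copy every hour:

0 * * * * /path/to/ingest.sh >> /var/log/ingest.log 2>&1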
