
Copy big files into HDFS

New Contributor

What is the best approach to automatically copying big data files (not log files!) from a Unix/Linux local filesystem into HDFS?

3 REPLIES

Re: Copy big files into HDFS

Master Guru
The question is missing some parameters, such as whether this is a regular, periodic operation or a one-time one.

If it is a one-time copy, as an assumption, then you can simply use the "hadoop fs -put source-files… destination-dir/" command.
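As a concrete sketch of that one-time case (the local and HDFS paths here are hypothetical placeholders, not from the thread):

```shell
# One-time copy of a directory of big files from the local
# filesystem into HDFS. All paths are example placeholders.

# Create the destination directory (and any missing parents) in HDFS:
hadoop fs -mkdir -p /data/incoming

# Copy every matching local file; -put streams each file into HDFS:
hadoop fs -put /local/bigfiles/*.bin /data/incoming/

# Sanity-check that the files and their sizes arrived:
hadoop fs -ls -h /data/incoming/
```

For very large files it can also be worth checking that dfs.blocksize is sized sensibly before the copy, since the block size is fixed at write time.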

Re: Copy big files into HDFS

New Contributor

I want to know: what is the best way to do it in a regular, automatic manner?

Re: Copy big files into HDFS

Expert Contributor

"Best" is a bit nebulous.

It's usually cleanest to change your code to write the big files to HDFS directly.
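The reply above means using the Hadoop client API (linked below) from your own code. If changing code is not an option but the producing process can write to standard output, a shell-level approximation is to stream straight into HDFS, since "hadoop fs -put -" reads from stdin. A sketch, where my_exporter and the paths are hypothetical:

```shell
# Stream a producer's output directly into an HDFS file, avoiding a
# local staging copy. "hadoop fs -put -" takes the data from stdin.
# "my_exporter" and the target path are hypothetical placeholders.
my_exporter --dump | hadoop fs -put - /data/exports/dump-$(date +%F).bin
```

This keeps only one copy of the data in flight, which matters when the files are too big to stage comfortably on the local disk.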

http://hadoop.apache.org/docs/current/api/

 

It sounds like this may be an integration problem where you don't have access to change the code. In that case, perhaps use some NFS trickery to write the files directly:

http://hadoop.apache.org/docs/r2.3.0/hadoop-project-dist/hadoop-hdfs/HdfsNfsGateway.html
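A minimal sketch of that NFS approach, assuming the HDFS NFS gateway described in the guide above is already running; the mount options follow that guide (the gateway only supports NFSv3), and the hostname and paths are hypothetical:

```shell
# Mount HDFS over NFSv3 via a running HDFS NFS gateway on nfs-gw-host.
# Hostname and paths are example placeholders.
sudo mkdir -p /mnt/hdfs
sudo mount -t nfs -o vers=3,proto=tcp,nolock,noacl,sync nfs-gw-host:/ /mnt/hdfs

# Ordinary tools (cp, rsync, application writes) now land in HDFS:
cp /local/bigfiles/huge.dat /mnt/hdfs/data/incoming/
```

Note that the gateway supports sequential writes only, which is fine for copying big files but not for tools that seek around while writing.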