
File load strategy for large files (per file volume greater than 1 TB)

Hi,

What would be a reliable, fail-safe strategy for loading very large files (more than 1 TB per file) into HDFS?


Flume provides the fail-safety and reliability, but my understanding is that it is ideally meant for ingesting regularly generated files into HDFS, i.e. scenarios where a large number of files arrive in mini-batches. It might not be efficient for transferring a single very large file into HDFS. Please let me know if I am wrong here.

Also, the hadoop fs -put command does not provide fail-safety: if the transfer fails partway, it will not restart the copy on its own.
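
For illustration, below is a minimal sketch of the kind of wrapper I have in mind, using the Hadoop FileSystem Java API (copyFromLocalFile, getFileStatus, rename). The temporary ._COPYING_ suffix, the length check, and the retry count are my own assumptions for the example, not a tested recipe, and a failed attempt restarts the whole copy from the beginning rather than resuming it.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    import java.io.File;
    import java.io.IOException;

    public class ReliablePut {
        public static void main(String[] args) throws Exception {
            String localFile = args[0];   // e.g. a local file larger than 1 TB
            String hdfsTarget = args[1];  // final HDFS destination path

            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);

            Path src = new Path(localFile);
            Path tmp = new Path(hdfsTarget + "._COPYING_");  // in-flight name (assumption)
            Path dst = new Path(hdfsTarget);

            int maxAttempts = 3;  // illustrative retry count
            for (int attempt = 1; attempt <= maxAttempts; attempt++) {
                try {
                    // Remove any partial copy left by a previous failed attempt.
                    if (fs.exists(tmp)) {
                        fs.delete(tmp, false);
                    }

                    // delSrc = false, overwrite = true; copies the whole file again on retry.
                    fs.copyFromLocalFile(false, true, src, tmp);

                    // Basic sanity check: compare local and HDFS file lengths before publishing.
                    long localLen = new File(localFile).length();
                    long hdfsLen = fs.getFileStatus(tmp).getLen();
                    if (localLen != hdfsLen) {
                        throw new IOException("Length mismatch: " + localLen + " vs " + hdfsLen);
                    }

                    // Rename is atomic in HDFS, so readers never see a half-written file
                    // under the final name.
                    fs.rename(tmp, dst);
                    System.out.println("Copied " + localFile + " to " + dst);
                    return;
                } catch (IOException e) {
                    System.err.println("Attempt " + attempt + " failed: " + e.getMessage());
                    if (attempt == maxAttempts) {
                        throw e;
                    }
                }
            }
        }
    }

The idea is that the rename to the final name happens only after the copy and the length check succeed, so a partial file is never published even if an attempt dies midway.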


Regards,

Rajib