Error loading a file into HDFS

We are trying to upload an 80 GB file to HDFS. We are writing the file to /opt/Hadoop/tmp/.hdfs-nfs. This works fine with small files, but not with larger ones.

Does anyone know where the file should be written temporarily before it moves into HDFS?

Is there some other setting we need to consider?

1 ACCEPTED SOLUTION

Master Mentor

@Jeremy Salazar You can upload to HDFS directly. If you're using the NFS gateway, it is not designed for files as large as yours; support for that is on the roadmap for NFSv4. I recommend compressing the file before you upload it with the -put command.

hdfs dfs -put file /user/username
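
For example, a minimal sketch of that workflow using gzip (the file name and target directory here are placeholders):

gzip bigfile.dat
hdfs dfs -put bigfile.dat.gz /user/username/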

3 REPLIES

Sorry, I should have specified that we are using Ambari to upload into HDFS.

Master Mentor

@Jeremy Salazar Do you mean the HDFS Ambari view? That won't work; the file is too big to upload through the Ambari view. Consider using the CLI instead.
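
For example, from a node where the Hadoop client is installed, something like this should work (the local path and HDFS directory are placeholders):

hdfs dfs -put /path/to/bigfile.gz /user/username/
hdfs dfs -ls /user/username    # confirm the file arrived and check its size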