Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

Error loading a file into HDFS

New Member

We are trying to upload an 80 GB file to HDFS. The file is written to /opt/Hadoop/tmp/.hdfs-nfs. This works fine with small files, but not with larger ones.

Does anyone know where the file should be written temporarily before it is moved into HDFS?

Is there some other setting we need to consider?

1 ACCEPTED SOLUTION

Master Mentor

@Jeremy Salazar you can upload to HDFS directly. The NFS gateway is not designed for files of the size you describe; improvements there are on the roadmap for NFSv4. I recommend compressing the file before uploading it with the -put command:

hdfs dfs -put file /user/username
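As a sketch of the suggested workflow (the file name and HDFS path here are illustrative, and gzip is just one compression option):

```shell
# Compress the file first to cut transfer size; -k keeps the original
gzip -k bigfile.dat

# Copy the compressed file into the user's HDFS home directory
hdfs dfs -put bigfile.dat.gz /user/username/

# Confirm the file landed, with a human-readable size (-h)
hdfs dfs -ls -h /user/username/bigfile.dat.gz
```

Note that the file is stored compressed in HDFS; whatever reads it later must handle the gzip format.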


3 REPLIES


New Member

Sorry, I should have specified that we are using Ambari to upload into HDFS.

Master Mentor

@Jeremy Salazar you mean the HDFS Ambari view? That won't work; the file is too big to upload through the Ambari view. Consider using the CLI instead.
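If you have shell access on a cluster node, the CLI upload might look like this (paths are illustrative; `-put` streams the file directly to HDFS without the browser upload limits of the Ambari view):

```shell
# Create the target directory if it does not already exist
hdfs dfs -mkdir -p /user/username/uploads

# Upload the large file directly; -f overwrites any partial earlier attempt
hdfs dfs -put -f /local/path/bigfile.dat /user/username/uploads/
```

Run this as a user with write permission on the target directory, or via `sudo -u hdfs` where appropriate.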