Error loading a file into HDFS
Labels: Apache Hadoop, HDFS
Created on 03-02-2016 04:15 PM - edited 09-16-2022 03:06 AM
We are trying to upload an 80GB file to HDFS. We are writing the file to /opt/Hadoop/tmp/.hdfs-nfs. It works fine with small files, but not with larger ones.
Does anyone know where the file should be written temporarily before it moves into HDFS?
Is there some other setting we need to consider?
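(For reference: the staging path mentioned above matches the HDFS NFS gateway's dump directory, where the gateway holds out-of-order writes before they reach HDFS. Assuming that gateway is what's in play, the location is controlled by the nfs.dump.dir property in hdfs-site.xml, per the Hadoop NFS gateway docs; the default is /tmp/.hdfs-nfs. A sketch, with the path from the question above as the value:)
<!-- hdfs-site.xml: temporary staging directory used by the NFS gateway -->
<property>
  <name>nfs.dump.dir</name>
  <value>/opt/Hadoop/tmp/.hdfs-nfs</value>
</property>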
Created 03-02-2016 05:00 PM
@Jeremy Salazar you can upload to HDFS directly. If you're going through the NFS gateway, it is not designed for files of that size; that's on the roadmap for NFSv4. I recommend compressing the file before you upload it with the -put command:
hdfs dfs -put file /user/username
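For example, a minimal sketch (the filename and target directory are placeholders, and gzip is just one compression option):
# Compress the file first to reduce the amount of data transferred
gzip bigfile.dat
# Copy the compressed file into HDFS with the standard put command
hdfs dfs -put bigfile.dat.gz /user/username/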
Created 03-02-2016 07:31 PM
Sorry, I should have specified that we are using Ambari to upload into HDFS.
Created 03-02-2016 07:34 PM
@Jeremy Salazar you mean the Ambari HDFS Files view? That won't work; an 80GB file is too large to upload through the Ambari view. Consider using the CLI instead.
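A minimal sketch of the CLI route (the hostname, filename, and paths are placeholders; assumes SSH access to a cluster node with the HDFS client installed):
# Stage the file on an edge/gateway node first
scp bigfile.dat user@edge-node:/data/staging/
# Then, from that node, load it into HDFS
hdfs dfs -put /data/staging/bigfile.dat /user/username/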
