Hadoop sequence file EOFException
Labels: Apache Hadoop, HDFS, MapReduce
Created on 10-21-2016 04:00 AM - edited 09-16-2022 03:45 AM
Hello. I am trying to process video on Hadoop using HVPI (https://github.com/xmpy/hvpi), an open-source library. I use it to extract frames from video, but it does not provide an output format for saving the frames to HDFS, so I tried to use SequenceFileOutputFormat for that. After reading a book I made some changes so the code would work on Hadoop 2.7.1. It apparently worked, but when I tried to read the file back in another MapReduce job I got an EOFException. I think there are three possibilities:
1. My SequenceFileOutputFormat is wrong and saves a corrupted file;
2. My SequenceFileInputFormat is wrong;
3. The HVPI custom type is wrong.
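For reference, possibility 3 is the classic cause of this symptom: if a custom Writable's `write` method emits fewer bytes than its `readFields` method tries to consume, the reader runs past the end of the record and gets an EOFException. Here is a minimal, Hadoop-free sketch of that failure mode; the class and method names are hypothetical, not HVPI's:

```java
import java.io.*;

public class EofDemo {
    // Buggy serializer: records the payload length but never writes the payload.
    public static void buggyWrite(DataOutput out, byte[] data) throws IOException {
        out.writeInt(data.length);
        // BUG: missing out.write(data);
    }

    // Deserializer expects length + payload, so it reads past the end of the stream.
    public static byte[] read(DataInput in) throws IOException {
        int len = in.readInt();
        byte[] buf = new byte[len];
        in.readFully(buf); // throws EOFException when the payload was never written
        return buf;
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        buggyWrite(new DataOutputStream(bos), new byte[]{1, 2, 3});
        try {
            read(new DataInputStream(new ByteArrayInputStream(bos.toByteArray())));
            System.out.println("round trip ok");
        } catch (EOFException e) {
            System.out.println("EOFException"); // this is what the reading job sees
        }
    }
}
```

The same mismatch inside a Hadoop Writable produces exactly the EOFException described above, because SequenceFile trusts `write` and `readFields` to be mirror images of each other.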
Created 10-22-2016 09:00 PM
I found the error. The HVPI custom type was wrong. I just added out.write(byteOutputStream.toByteArray()); to the write method and now it works.
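Roughly, the fix looks like the sketch below, assuming the custom type holds its frame as a byte array (class and field names here are hypothetical, and the real class would implement org.apache.hadoop.io.Writable; that interface is omitted so the sketch stays dependency-free):

```java
import java.io.*;

public class FrameWritable {
    private byte[] frameBytes = new byte[0];

    public void set(byte[] data) { frameBytes = data; }
    public byte[] get() { return frameBytes; }

    // write must emit every byte that readFields will later consume
    public void write(DataOutput out) throws IOException {
        ByteArrayOutputStream byteOutputStream = new ByteArrayOutputStream();
        byteOutputStream.write(frameBytes); // in HVPI this would be the encoded frame
        byte[] payload = byteOutputStream.toByteArray();
        out.writeInt(payload.length);
        out.write(payload); // the call that was missing, causing the EOFException
    }

    public void readFields(DataInput in) throws IOException {
        int len = in.readInt();
        frameBytes = new byte[len];
        in.readFully(frameBytes);
    }

    public static void main(String[] args) throws IOException {
        // Round-trip check: serialize, then deserialize into a fresh instance.
        FrameWritable w = new FrameWritable();
        w.set(new byte[]{10, 20, 30});
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        w.write(new DataOutputStream(bos));
        FrameWritable r = new FrameWritable();
        r.readFields(new DataInputStream(new ByteArrayInputStream(bos.toByteArray())));
        System.out.println(r.get().length); // prints 3
    }
}
```

With the `out.write(...)` call in place, `write` and `readFields` are symmetric again and the SequenceFile reads back cleanly.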
Created 10-22-2016 08:40 PM
I found the error. The HVPI custom type was wrong. It was in the write method.
