Is there a way to unit test an inputformat/recordreader? MRUnit doesn't have anything obvious for testing an inputformat.


New Contributor

We are exploring options outside of Hive for processing some truly huge XML records. We keep running into OOM errors all over the place, even with 16 GB containers. Currently we store our ginormous XML inside Avro sequence files in a Hive table, and we have an MR job using HCatInputFormat to read the data. Performance is subpar at best: because our container sizes are so large we can only run 8 per node, and even then we OOM often enough to warrant increasing them further.

We have ideas to improve the write-out to ORC for the destination data (which will still be in Hive) so that side is less likely to error or OOM, so now our focus is on the input side and reducing memory allocation there. Our current thinking is to move to a streaming SAX-style parse of the XML so that the entire record is never resident in memory on the input side. The actual object that results from parsing will be, and the outbound serialization will remain. Looking through the various InputFormats in Hadoop, I think the way things are written there are at least two copies of the record in RAM long before we see it in the mapper and make our own copy and the subsequent serialization copy. Sadly, I've traced this copying all the way down into the sequence file readers themselves, and I don't see any good way to remove it while still using a sequence file.
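For the streaming-parse idea, the JDK's built-in StAX pull parser (`javax.xml.stream`) reads from an `InputStream` token by token, so the document is never fully materialized. A minimal sketch, assuming a hypothetical `<record>` element name just for illustration:

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class StaxSketch {
    // Count <record> elements by pulling events off the stream;
    // only the current token is held in memory, never the whole document.
    public static int countRecords(InputStream in) throws Exception {
        XMLStreamReader reader = XMLInputFactory.newFactory().createXMLStreamReader(in);
        int count = 0;
        while (reader.hasNext()) {
            if (reader.next() == XMLStreamConstants.START_ELEMENT
                    && "record".equals(reader.getLocalName())) {
                count++;
            }
        }
        reader.close();
        return count;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<root><record>a</record><record>b</record></root>";
        System.out.println(countRecords(
                new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8))));
    }
}
```

In a real reader you would build your output object incrementally from the `START_ELEMENT`/`CHARACTERS`/`END_ELEMENT` events instead of counting them.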

So my current thought is to move all the work down into a custom InputFormat that is the thinnest possible layer around an InputStream. Most of the FileInputFormat subclasses look unusable as well, since they typically use LineRecordReader underneath, and that does a full byte[] copy of the entire record.
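A thin RecordReader over the raw `FSDataInputStream` might look roughly like this. This is only a sketch of the shape (it needs Hadoop 2.x on the classpath, the key/value types are placeholder choices, and the parser hookup is left as a comment), not a working reader:

```java
import java.io.IOException;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;

public class StreamingRecordReader extends RecordReader<LongWritable, Text> {
    private FSDataInputStream in;
    private long start, end, pos;
    private final LongWritable key = new LongWritable();
    private final Text value = new Text();

    @Override
    public void initialize(InputSplit split, TaskAttemptContext context)
            throws IOException {
        FileSplit fileSplit = (FileSplit) split;
        Path path = fileSplit.getPath();
        FileSystem fs = path.getFileSystem(context.getConfiguration());
        in = fs.open(path);          // raw stream; no per-record byte[] buffering here
        start = fileSplit.getStart();
        end = start + fileSplit.getLength();
        in.seek(start);
        pos = start;
    }

    @Override
    public boolean nextKeyValue() throws IOException {
        if (pos >= end) return false;
        key.set(pos);
        // Plug a StAX/SAX parser in here: let it pull bytes from `in` on
        // demand for one logical record, then update `pos` from in.getPos().
        // For the sketch we just consume the rest of the split.
        value.clear();
        pos = end;
        return true;
    }

    @Override public LongWritable getCurrentKey() { return key; }
    @Override public Text getCurrentValue()       { return value; }

    @Override
    public float getProgress() {
        return end == start ? 1.0f : (pos - start) / (float) (end - start);
    }

    @Override
    public void close() throws IOException {
        if (in != null) in.close();
    }
}
```

One caveat: a streaming reader like this can't easily honor split boundaries mid-record, so you may need to mark the files unsplittable or find a record-sync strategy.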

With that stated, is there a way to unit test an inputformat/recordreader? MRUnit doesn't have anything obvious for testing an inputformat.
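For what it's worth, one way to exercise a RecordReader without MRUnit is to drive the InputFormat directly with Hadoop's `TaskAttemptContextImpl` (Hadoop 2.x; the class lives elsewhere in MRv1). A sketch, using `TextInputFormat` and a hypothetical local fixture file as stand-ins for a custom format:

```java
import java.io.File;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.TaskAttemptID;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl;

public class RecordReaderTestSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "file:///");          // read from the local FS

        File input = new File("target/test-input.txt"); // hypothetical fixture path
        java.nio.file.Files.write(input.toPath(), "one\ntwo\n".getBytes());

        TaskAttemptContext context =
                new TaskAttemptContextImpl(conf, new TaskAttemptID());
        FileSplit split = new FileSplit(
                new Path(input.toURI()), 0, input.length(), null);

        TextInputFormat format = new TextInputFormat();  // swap in your own format
        RecordReader<LongWritable, Text> reader =
                format.createRecordReader(split, context);
        reader.initialize(split, context);
        while (reader.nextKeyValue()) {
            // assert on reader.getCurrentKey() / reader.getCurrentValue() here
            System.out.println(reader.getCurrentKey() + "\t" + reader.getCurrentValue());
        }
        reader.close();
    }
}
```

The same pattern works inside a JUnit test method, and because the split points at a local file you never need a cluster or MiniDFS.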


Re: Is there a way to unit test an inputformat/recordreader? MRUnit doesn't have anything obvious for testing an inputformat.

Contributor

Just some thoughts before you write your own custom input format. It seems your XML files are incredibly large and you want to keep the entire XML from being loaded into memory. StAX is definitely one way, but you lose parallelism. If you want to preprocess your XML, try:

  1. looking into Mahout's XMLInputFormat (https://dzone.com/articles/hadoop-practice),
  2. or using Pig's XML loader (http://hadoopgeek.com/apache-pig-xml-parsing-xpath/).
  3. Better yet, you could also look at HDF/NiFi to process the XML (https://community.hortonworks.com/articles/25720/parsing-xml-logs-with-nifi-part-1-of-3.html).

HTH.

Re: Is there a way to unit test an inputformat/recordreader? MRUnit doesn't have anything obvious for testing an inputformat.

New Contributor

Thanks @Ed Gleeck... We tried Mahout, but unfortunately it wasn't the solution we were looking for. We hadn't thought about Pig or HDF/NiFi, though; thanks for that.