
How to save Mapreduce's Reducer output without Key,Value pair?
I am writing a MapReduce program to process DICOM images.

The purpose of this MapReduce program is to process a DICOM image, extract its metadata, index it to Solr, and finally, in the Reducer phase, save the raw image in HDFS.

I want to save the same file in HDFS as the reducer output.

I have achieved most of the functionality, but in the reducer phase, storing the same file in HDFS does not work.

I have tested the processed DICOM file with a DICOM image viewer: it reports the file as corrupted, and the processed file is also slightly larger. **Ex.** the original DICOM file is 628 KB, and after the reducer saves it to HDFS its size changes to 630 KB.
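Two effects could plausibly account for the extra bytes (a guess from the code shown, not a verified diagnosis): the default `TextOutputFormat` writes the key, a tab separator, and a trailing newline around every record, and `BytesWritable.getBytes()` returns the whole internal buffer, which can be larger than the valid data reported by `getLength()`. A minimal, self-contained sketch of the second effect, using a hypothetical `GrowableBytes` stand-in instead of the real `BytesWritable`:

```java
import java.util.Arrays;

// Hypothetical stand-in for BytesWritable's growable buffer, for illustration
// only: like BytesWritable.setSize(), set() grows the backing array to ~1.5x
// the requested size, so getBytes() returns trailing zero padding.
public class PaddingDemo {

    static class GrowableBytes {
        private byte[] bytes = new byte[0];
        private int length;

        void set(byte[] data, int offset, int len) {
            length = len;
            if (bytes.length < len) {
                bytes = new byte[len + (len >> 1)]; // grow to ~1.5x, like setCapacity()
            }
            System.arraycopy(data, offset, bytes, 0, len);
        }

        byte[] getBytes() { return bytes; }  // whole backing array, padding included
        int getLength()   { return length; } // number of valid bytes
    }

    public static void main(String[] args) {
        GrowableBytes value = new GrowableBytes();
        byte[] dicom = new byte[628 * 1024]; // pretend this is the 628 KB file
        value.set(dicom, 0, dicom.length);

        // Writing getBytes() wholesale emits the padding too:
        System.out.println(value.getBytes().length > value.getLength()); // true
        byte[] trimmed = Arrays.copyOf(value.getBytes(), value.getLength());
        System.out.println(trimmed.length == dicom.length);              // true
    }
}
```

If this is part of the cause, trimming with `Arrays.copyOf(value.getBytes(), value.getLength())`, or writing only the first `getLength()` bytes, avoids emitting the padding.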

I have tried the solutions from these links, but none of them gives the expected results.

Here is the code for reading a DICOM file as a single file (without splitting it):

    public class WholeFileInputFormat extends FileInputFormat<NullWritable, BytesWritable> {

        @Override
        protected boolean isSplitable(JobContext context, Path filename) {
            return false; // read each Dicom file whole, as a single record
        }

        @Override
        public RecordReader<NullWritable, BytesWritable> createRecordReader(InputSplit split, TaskAttemptContext context)
                throws IOException, InterruptedException {
            WholeFileRecordReader reader = new WholeFileRecordReader();
            reader.initialize(split, context);
            return reader;
        }
    }

Custom RecordReader

    public class WholeFileRecordReader extends RecordReader<NullWritable, BytesWritable> {
        private FileSplit fileSplit;
        private Configuration conf;
        private BytesWritable value = new BytesWritable();
        private boolean processed = false;

        @Override
        public void initialize(InputSplit split, TaskAttemptContext context) throws IOException, InterruptedException {
            this.fileSplit = (FileSplit) split;
            this.conf = context.getConfiguration();
        }

        @Override
        public boolean nextKeyValue() throws IOException, InterruptedException {
            if (!processed) {
                byte[] contents = new byte[(int) fileSplit.getLength()];
                Path file = fileSplit.getPath();
                FileSystem fs = file.getFileSystem(conf);
                FSDataInputStream in = null;
                try {
                    in =;
                    IOUtils.readFully(in, contents, 0, contents.length);
                    value.set(contents, 0, contents.length);
                } finally {
                    IOUtils.closeStream(in);
                }
                processed = true;
                return true;
            }
            return false;
        }

        @Override
        public NullWritable getCurrentKey() throws IOException, InterruptedException {
            return NullWritable.get();
        }

        @Override
        public BytesWritable getCurrentValue() throws IOException, InterruptedException {
            return value;
        }

        @Override
        public float getProgress() throws IOException, InterruptedException {
            return processed ? 1.0f : 0.0f;
        }

        @Override
        public void close() throws IOException {
            // nothing to do; the stream is closed in nextKeyValue()
        }
    }

Mapper Class

The mapper class works perfectly for our needs.

    public class MapClass {
        public static class Map extends Mapper<NullWritable, BytesWritable, Text, BytesWritable> {

            @Override
            protected void map(NullWritable key, BytesWritable value,
                    Mapper<NullWritable, BytesWritable, Text, BytesWritable>.Context context)
                    throws IOException, InterruptedException {
                // read only the valid bytes, not the padded backing array
                InputStream in = new ByteArrayInputStream(value.getBytes(), 0, value.getLength());
                ProcessDicom.metadata(in); // process the Dicom image and extract metadata from it
                Text keyOut = getFileName(context);
                context.write(keyOut, value);
            }

            private Text getFileName(Mapper<NullWritable, BytesWritable, Text, BytesWritable>.Context context) {
                InputSplit spl = context.getInputSplit();
                Path filePath = ((FileSplit) spl).getPath();
                String fileName = filePath.getName();
                return new Text(fileName);
            }

            @Override
            protected void setup(Mapper<NullWritable, BytesWritable, Text, BytesWritable>.Context context)
                    throws IOException, InterruptedException {
            }
        }
    }

Reducer Class

This is the reducer class.

    public class ReduceClass {
        public static class Reduce extends Reducer<Text, BytesWritable, BytesWritable, BytesWritable> {

            @Override
            protected void reduce(Text key, Iterable<BytesWritable> value,
                    Reducer<Text, BytesWritable, BytesWritable, BytesWritable>.Context context)
                    throws IOException, InterruptedException {
                Iterator<BytesWritable> itr = value.iterator();
                while (itr.hasNext()) {
                    BytesWritable wr =;
                    context.write(new BytesWritable(key.copyBytes()), wr);
                }
            }
        }
    }

Main Class

    public class DicomIndexer {

        public static void main(String[] argss) throws Exception {
            String args[] = {"file:///home/b3ds/storage/dd", "hdfs://"};
            run(args);
        }

        public static void run(String[] args) throws Exception {
            // Initialize the Hadoop job and set the jar as well as the name of the Job
            Configuration conf = new Configuration();
            Job job = new Job(conf, "WordCount");
    //        job.getConfiguration().set("mapreduce.output.basename", "hi");
            job.setJarByClass(DicomIndexer.class);
            job.setMapperClass(MapClass.Map.class);
            job.setReducerClass(ReduceClass.Reduce.class);
            job.setInputFormatClass(WholeFileInputFormat.class);
            WholeFileInputFormat.setInputPaths(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            job.waitForCompletion(true);
        }
    }

So I am completely clueless about what to do. Some of the links say it is not possible, since MapReduce works on <key, value> pairs, and some say to use NullWritable. So far I have tried NullWritable and SequenceFileOutputFormat, but neither of them works.
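For what it's worth, one common approach to this problem is a custom `FileOutputFormat` whose `RecordWriter` writes only the value's valid bytes, with no key, no tab separator, and no trailing newline. This is a hedged sketch against the Hadoop 2.x `mapreduce` API, not tested with this job: `WholeFileOutputFormat` is my own name, and the `.dcm` extension is an assumption.

```java
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.mapreduce.RecordWriter;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Hypothetical output format (my own name, not a stock Hadoop class): writes
// the raw value bytes only, so the output file is byte-for-byte what the
// reducer emitted.
public class WholeFileOutputFormat extends FileOutputFormat<NullWritable, BytesWritable> {

    @Override
    public RecordWriter<NullWritable, BytesWritable> getRecordWriter(TaskAttemptContext context)
            throws IOException {
        Path file = getDefaultWorkFile(context, ".dcm"); // extension is an assumption
        final FSDataOutputStream out =
                file.getFileSystem(context.getConfiguration()).create(file, false);
        return new RecordWriter<NullWritable, BytesWritable>() {
            @Override
            public void write(NullWritable key, BytesWritable value) throws IOException {
                // write only the valid bytes, not the padded backing array
                out.write(value.getBytes(), 0, value.getLength());
            }

            @Override
            public void close(TaskAttemptContext ctx) throws IOException {
                out.close();
            }
        };
    }
}
```

With something like this, the driver would call `job.setOutputFormatClass(WholeFileOutputFormat.class)` and the reducer would emit `context.write(NullWritable.get(), value)`. To carry the original filename through to the output path, `MultipleOutputs` (or creating the file directly in the reducer via `FileSystem.create`) could be used instead of `getDefaultWorkFile`.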
