06-14-2016
05:32 AM
@Alexander Yau, the error shown here is caused by a mismatch between the output value class configured at job submission time and the value class the reducer actually writes. The exception text indicates the job expects IntWritable but instead received an instance of MapWritable:

java.io.IOException: wrong value class: class org.apache.hadoop.io.MapWritable is not class org.apache.hadoop.io.IntWritable

At job submission time, the output value class is set to IntWritable:

job.setOutputValueClass(IntWritable.class);

However, the reducer class parameterizes the output value type as MapWritable:

public static class IntSumReducer extends Reducer<Text, IntWritable, Text, MapWritable> {

Likewise, the reducer logic writes a MapWritable instance to the context:

private MapWritable result = new MapWritable();
...
result.put(myurl, new IntWritable(sum));
context.write(mykey, result);

To fix this error, you'll need the job submission and the reducer to use the same output value class. Judging from your description of what you're trying to achieve with this job, it sounds like you want MapWritable for the outputs. Therefore, I recommend testing again with the job submission line changed to this:

job.setOutputValueClass(MapWritable.class);
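To see why the framework rejects the write at runtime rather than at compile time, here is a minimal, Hadoop-free sketch. The TypedWriter class is hypothetical (not part of Hadoop's API); it just mirrors the kind of declared-class check a SequenceFile-style output format performs, with Integer standing in for IntWritable and HashMap standing in for MapWritable:

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;

// Hypothetical stand-in for a MapReduce output collector: it remembers the
// value class declared at job-setup time and rejects any other class at
// write time, much like Hadoop's "wrong value class" check.
class TypedWriter<V> {
    private final Class<V> declaredValueClass;
    private final List<V> written = new ArrayList<>();

    TypedWriter(Class<V> declaredValueClass) {
        this.declaredValueClass = declaredValueClass;
    }

    void write(Object value) throws IOException {
        if (!declaredValueClass.isInstance(value)) {
            throw new IOException("wrong value class: "
                    + value.getClass().getName()
                    + " is not " + declaredValueClass.getName());
        }
        written.add(declaredValueClass.cast(value));
    }

    int count() {
        return written.size();
    }
}

public class WrongValueClassDemo {
    public static void main(String[] args) {
        // Declared output value class: Integer (stands in for IntWritable).
        TypedWriter<Integer> writer = new TypedWriter<>(Integer.class);
        try {
            writer.write(42); // matches the declared class: accepted
            writer.write(new HashMap<String, Integer>()); // stands in for MapWritable: rejected
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
        System.out.println("accepted=" + writer.count());
    }
}
```

The generic parameter only helps the compiler; the declared class passed to the constructor is what the runtime check enforces, which is why the job-level setOutputValueClass setting and the reducer's actual writes must agree.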