
Mapreduce Custom Partition for NullWritable key type in Hadoop



I have a requirement where I need only the value from HBase, not the row key, written to the output file. For that I used NullWritable.class as my map-output key type. Now I have to partition my output data based on a column's value, but custom partitioning works on the key, so I am getting the exception below.

This is where I am getting the exception:

if (partition < 0 || partition >= partitions) {
          throw new IOException("Illegal partition for " + key + " (" +
              partition + ")");
        }
  Caused by: java.io.IOException: Illegal partition for (null) (40)

Here is the driver code I am using:

  TableMapReduceUtil.initTableMapperJob(args[0], // input table
           scan, // Scan instance to control CF and attribute selection
           DefaultMapper.class, // mapper class
           NullWritable.class, // mapper output key
           Text.class, // mapper output value
           job);

This is my Partitioner code:

  public class FinancialLineItemPartioner extends Partitioner<NullWritable, Text> {
      @Override
      public int getPartition(NullWritable key, Text value, int setNumRedTask) {
          String str = key.toString();
          if (str.contains("Japan|^|BUS")) {
              return 0;
          } else if (str.contains("Japan|^|CAS")) {
              return 1;
          } else if (str.contains("Japan|^|CUS")) {
              return 2;
          } else {
              return 3;
          }
      }
  }

Please suggest.

Note: if I interchange the map-output key/value parameters, then my reducer does not work.

3 REPLIES

Re: Mapreduce Custom Partition for NullWritable key type in Hadoop

A Mapper's key cannot be null; that makes no sense, as you have already discovered. So set your output key class to Text, and for applicable records emit your HBase record's value (of a certain column, I suppose) as your Mapper's key. Then in your Partitioner fix the type of the key (Text), and also correct your Reducer to accept Text keys and ignore the values.
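A minimal sketch of what that routing would look like once the column value is the Text key. The class name `KeyRouting` is illustrative; the marker strings are taken from the partitioner posted above, and in the real job this logic would sit inside `Partitioner<Text, Text>.getPartition(key, value, numReduceTasks)`. It is factored into a plain static method here so the routing can be checked without a Hadoop runtime:

```java
public class KeyRouting {

    // Route a key string to a reducer index, mirroring the original
    // partitioner's markers. Called with key.toString() in the real job.
    public static int partitionFor(String key) {
        if (key.contains("Japan|^|BUS")) {
            return 0;
        } else if (key.contains("Japan|^|CAS")) {
            return 1;
        } else if (key.contains("Japan|^|CUS")) {
            return 2;
        }
        // Everything else goes to the last reducer.
        return 3;
    }

    public static void main(String[] args) {
        System.out.println(partitionFor("Japan|^|BUS|^|12345")); // prints 0
        System.out.println(partitionFor("Germany|^|BUS"));       // prints 3
    }
}
```

Note that whatever this returns must stay below the job's reducer count (e.g. `job.setNumReduceTasks(4)` for the four buckets above), otherwise you get exactly the "Illegal partition" IOException from the question.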

Re: Mapreduce Custom Partition for NullWritable key type in Hadoop

Rising Star

I have changed the code. Now I am passing both key and value as Text, but I am still getting the same error.

Re: Mapreduce Custom Partition for NullWritable key type in Hadoop

Okay, now go to your Mapper code and make sure you don't emit any records with a null Mapper key. If it still doesn't work, post your Mapper's code.
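A small sketch of that guard, with the map-side emit loop modeled as a plain method over strings. `NullSafeEmit` and `emitKeys` are hypothetical names; in the real Mapper the skip would happen before `context.write(new Text(v), ...)`:

```java
import java.util.ArrayList;
import java.util.List;

public class NullSafeEmit {

    // Collect the keys a mapper would emit, skipping records whose
    // column value is null or empty instead of writing a null key.
    public static List<String> emitKeys(List<String> cellValues) {
        List<String> emitted = new ArrayList<>();
        for (String v : cellValues) {
            if (v == null || v.isEmpty()) {
                continue; // skip the record; never context.write a null key
            }
            emitted.add(v);
        }
        return emitted;
    }
}
```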