
How to resolve NULL values coming from the source table?

New Contributor

Below is the error log I received. I'm requesting help identifying the cause of the error and finding a solution.

On the MR execution engine:

Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.serde2.SerDeException: java.lang.NullPointerException
    at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:257)
    at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:444)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1595)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.serde2.SerDeException: java.lang.NullPointerException
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:787)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:879)
    at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:879)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.internalForward(CommonJoinOperator.java:647)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genUniqueJoinObject(CommonJoinOperator.java:660)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genUniqueJoinObject(CommonJoinOperator.java:663)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.checkAndGenObject(CommonJoinOperator.java:759)
    at org.apache.hadoop.hive.ql.exec.JoinOperator.endGroup(JoinOperator.java:265)
    at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:196)
    ... 7 more
Caused by: org.apache.hadoop.hive.serde2.SerDeException: java.lang.NullPointerException
    at org.apache.hadoop.hive.hbase.HBaseSerDe.serialize(HBaseSerDe.java:301)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:714)
    ... 16 more
Caused by: java.lang.NullPointerException
    at org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableLongObjectInspector.get(WritableLongObjectInspector.java:36)
    at org.apache.hadoop.hive.serde2.lazy.LazyUtils.writePrimitiveUTF8(LazyUtils.java:243)
    at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(HBaseRowSerializer.java:236)
    at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(HBaseRowSerializer.java:295)
    at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(HBaseRowSerializer.java:222)
    at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serializeKeyField(HBaseRowSerializer.java:140)
    at org.apache.hadoop.hive.hbase.DefaultHBaseKeyFactory.serializeKey(DefaultHBaseKeyFactory.java:59)
    at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(HBaseRowSerializer.java:93)
    at org.apache.hadoop.hive.hbase.HBaseSerDe.serialize(HBaseSerDe.java:297)

On the Tez execution engine:

Status: Failed
Vertex failed, vertexName=Reducer 6, vertexId=vertex_1508199090281_70488_6_09, diagnostics=[Task failed, taskId=task_1508199090281_70488_6_09_000000, diagnostics=[TaskAttempt 0 failed, info=[Error: Error while running task ( failure ) : attempt_1508199090281_70488_6_09_000000_0:java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"reducesinkkey0":"anthroma"},"value":{"_col0":"nam-6802","_col1":"APP Corporation","_col2":"Customer","_col3":"Licensed","_col4":"FALSE","_col5":"app.com.au","_col6":"400","_col7":"394","_col8":"2/2/2016","_col9":"1/19/2020","_col10":"2/18/2016","_col11":"11/21/2016","_col12":"1/9/2017","_col13":"7/17/2017","_col14":"Nhan Do","_col16":"7","_col17":"7","_col28":"50","_col29":"1/19/2017","_col30":"1/19/2020","_col31":"103585981","_col37":"FP-AMP-LIC=","_col38":"200296824","_col42":"1/19/2017","_col43":"1/18/2020","_col62":"7141271","_col77":"","_col106":"FALSE","_col107":"FALSE","_col109":"TRUE","_col110":"2","_col111":"OK:CWS Customer","_col119":"Viktor Smolin","_col121":"10/10/2017","_col126":254332105,"_col183":7141271}}
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:211)
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:168)
    at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:370)
    at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:73)
    at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:61)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1595)
    at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:61)
    at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:37)
    at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:744)
Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"reducesinkkey0":"anthroma"},"value":{"_col0":"nam-6802","_col1":"APP Corporation","_col2":"Customer","_col3":"Licensed","_col4":"FALSE","_col5":"app.com.au","_col6":"400","_col7":"394","_col8":"2/2/2016","_col9":"1/19/2020","_col10":"2/18/2016","_col11":"11/21/2016","_col12":"1/9/2017","_col13":"7/17/2017","_col14":"Nhan Do","_col16":"7","_col17":"7","_col28":"50","_col29":"1/19/2017","_col30":"1/19/2020","_col31":"103585981","_col37":"FP-AMP-LIC=","_col38":"200296824","_col42":"1/19/2017","_col43":"1/18/2020","_col62":"7141271","_col77":"","_col106":"FALSE","_col107":"FALSE","_col109":"TRUE","_col110":"2","_col111":"OK:CWS Customer","_col119":"Viktor Smolin","_col121":"10/10/2017","_col126":254332105,"_col183":7141271}}
    at org.apache.hadoop.hive.ql.exec.tez.ReduceRecordSource.pushRecord(ReduceRecordSource.java:289)
    at org.apache.hadoop.hive.ql.exec.tez.ReduceRecordProcessor.run(ReduceRecordProcessor.java:279)
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:185)
    ... 14 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"reducesinkkey0":"anthroma"},"value":{"_col0":"nam-6802","_col1":"APP Corporation","_col2":"Customer","_col3":"Licensed","_col4":"FALSE","_col5":"app.com.au","_col6":"400","_col7":"394","_col8":"2/2/2016","_col9":"1/19/2020","_col10":"2/18/2016","_col11":"11/21/2016","_col12":"1/9/2017","_col13":"7/17/2017","_col14":"Nhan Do","_col16":"7","_col17":"7","_col28":"50","_col29":"1/19/2017","_col30":"1/19/2020","_col31":"103585981","_col37":"FP-AMP-LIC=","_col38":"200296824","_col42":"1/19/2017","_col43":"1/18/2020","_col62":"7141271","_col77":"","_col106":"FALSE","_col107":"FALSE","_col109":"TRUE","_col110":"2","_col111":"OK:CWS Customer","_col119":"Viktor Smolin","_col121":"10/10/2017","_col126":254332105,"_col183":7141271}}
    at org.apache.hadoop.hive.ql.exec.tez.ReduceRecordSource$GroupIterator.next(ReduceRecordSource.java:357)
    at org.apache.hadoop.hive.ql.exec.tez.ReduceRecordSource.pushRecord(ReduceRecordSource.java:279)
    ... 16 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.serde2.SerDeException: java.lang.NullPointerException
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:787)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:879)
    at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:879)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.internalForward(CommonJoinOperator.java:647)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genUniqueJoinObject(CommonJoinOperator.java:660)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genUniqueJoinObject(CommonJoinOperator.java:663)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.checkAndGenObject(CommonJoinOperator.java:759)
    at org.apache.hadoop.hive.ql.exec.CommonMergeJoinOperator.joinObject(CommonMergeJoinOperator.java:316)
    at org.apache.hadoop.hive.ql.exec.CommonMergeJoinOperator.joinOneGroup(CommonMergeJoinOperator.java:279)
    at org.apache.hadoop.hive.ql.exec.CommonMergeJoinOperator.joinOneGroup(CommonMergeJoinOperator.java:272)
    at org.apache.hadoop.hive.ql.exec.CommonMergeJoinOperator.process(CommonMergeJoinOperator.java:258)
    at org.apache.hadoop.hive.ql.exec.tez.ReduceRecordSource$GroupIterator.next(ReduceRecordSource.java:348)
    ... 17 more
Caused by: org.apache.hadoop.hive.serde2.SerDeException: java.lang.NullPointerException
    at org.apache.hadoop.hive.hbase.HBaseSerDe.serialize(HBaseSerDe.java:301)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:714)
    ... 29 more
Caused by: java.lang.NullPointerException
    at org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableLongObjectInspector.get(WritableLongObjectInspector.java:36)
    at org.apache.hadoop.hive.serde2.lazy.LazyUtils.writePrimitiveUTF8(LazyUtils.java:243)
    at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(HBaseRowSerializer.java:236)
    at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(HBaseRowSerializer.java:295)
    at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(HBaseRowSerializer.java:222)
    at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serializeKeyField(HBaseRowSerializer.java:140)
    at org.apache.hadoop.hive.hbase.DefaultHBaseKeyFactory.serializeKey(DefaultHBaseKeyFactory.java:59)
    at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(HBaseRowSerializer.java:93)
    at org.apache.hadoop.hive.hbase.HBaseSerDe.serialize(HBaseSerDe.java:297)
    ... 30 more
], TaskAttempt 1 failed, TaskAttempt 2 failed, TaskAttempt 3 failed (each attempt reports the identical stack trace)]], Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:1 killedTasks:0, Vertex vertex_1508199090281_70488_6_09 [Reducer 6] killed/failed due to:OWN_TASK_FAILURE]
DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Reducer 6, vertexId=vertex_1508199090281_70488_6_09 (the same diagnostics as above are repeated here). DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0

1 ACCEPTED SOLUTION

Super Guru

@Swati Sinha

The exception you are getting is:

java.lang.NullPointerException
    at org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableLongObjectInspector.get(WritableLongObjectInspector.java:36)
    at org.apache.hadoop.hive.serde2.lazy.LazyUtils.writePrimitiveUTF8(LazyUtils.java:243)

Somewhere in your data, Hive is expecting a long value and not getting one. Looking at the record in your log file, the only attributes that jump out are the following:

,"_col126":254332105,"_col183":7141271

I think this is malformed JSON and the values after the colon (:) should be quoted, just like the rest of your JSON record. I could be wrong here, but that is what it looks like right now.
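If you want to confirm that a NULL is reaching that long field, a rough check along the lines below may help before re-running the insert. The table and column names are placeholders, not taken from your job; substitute whatever actually feeds the bigint field in your target table.

-- 'your_source_table' and 'your_long_column' are placeholders: substitute the
-- table and bigint column that map to the HBase key field in your insert.
select count(*) as null_long_values
from your_source_table
where your_long_column is null;

If this returns a non-zero count, those are the rows the serializer is tripping over.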


2 REPLIES


New Contributor

@mqureshi

Thanks for your inputs 🙂

I would like to add that instead of a long value, the field may be receiving a NULL value. I'll explain my problem in detail below.

Below is an outline of the query I'm using:

insert overwrite table db.test_tbl
select
  named_struct('end_customers_d_party_id', A.end_customers_d_party_id,
               'dv_cr_party_id', B.dv_cr_party_id,
               'original_sales_order', A.original_sales_order) as key,
  case when A.manager_name is not NULL OR A.manager_name <> '' OR length(A.manager_name) > 0 then A.manager_name else '' end as manager_name,
  case when G.cec_id is not NULL OR G.cec_id <> '' OR length(G.cec_id) > 0 then G.cec_id else '' end as cec_id,
  case when G.primary_name is not NULL OR G.primary_name <> '' OR length(G.primary_name) > 0 then G.primary_name else '' end as primary_name,
  case when E.cse_id is not NULL OR E.cse_id <> '' OR length(E.cse_id) > 0 then E.cse_id else '' end as cse_id,
  case when C.companyname is not NULL OR C.companyname <> '' OR length(C.companyname) > 0 then C.companyname else '' end as companyname,
  case when A.product_id is not NULL OR A.product_id <> '' OR length(A.product_id) > 0 then A.product_id else '' end as product_id
from db.amp_provision C
inner join db.table1 A   on TRIM(C.guid) = TRIM(A.guid)
inner join db.table2 D   on TRIM(C.guid) = TRIM(D.guid)
inner join db.table3 AUL on TRIM(C.guid) = TRIM(AUL.guid)
join       db.table4 B   on TRIM(A.original_sales_order) = B.sales_order_num and B.offer_code = 'X'
inner join db.table5 E   on TRIM(C.guid) = TRIM(E.offer_reference_id)
inner join db.table6 F   on B.dv_cr_party_id = F.cr_party_id and E.cse_id = F.cs_ent_cust_id and E.offer_name = 'X'

The issue is that the column cse_id came back as NULL for one of the persistent customers, because that customer was getting dropped by the last join condition (E.cse_id = F.cs_ent_cust_id) and is not present in table5 at all. (The same value is present in all the other tables, table1 through table4.)

My question now is how I can overcome this. I want to retain some customers based on their cse_id, irrespective of whether they are present in table5, which has a high chance of dropping a few customers every time it is refreshed. Using a LEFT JOIN with table5 causes a VERTEX FAILURE in the Hive run; that error is posted above.
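For clarity, the shape of what I'm after looks like the simplified SELECT below. Only the key struct and cse_id are shown, and the coalesce default ('') is a placeholder guess at how the NULL introduced by the LEFT JOIN might be handled; the remaining columns and joins would stay as in the outline above.

-- Simplified sketch: the other columns and the joins to table2, table3 and
-- table6 from the outline above are omitted here but would stay unchanged.
select
  named_struct('end_customers_d_party_id', A.end_customers_d_party_id,
               'dv_cr_party_id', B.dv_cr_party_id,
               'original_sales_order', A.original_sales_order) as key,
  coalesce(E.cse_id, '') as cse_id  -- placeholder default so this column is never NULL after the LEFT JOIN
from db.amp_provision C
inner join db.table1 A on TRIM(C.guid) = TRIM(A.guid)
inner join db.table4 B on TRIM(A.original_sales_order) = B.sales_order_num and B.offer_code = 'X'
left join  db.table5 E on TRIM(C.guid) = TRIM(E.offer_reference_id)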

Kindly help with a robust solution to this. I'm happy to explain the above issue in more detail if required.

THANKS ALL !!!

🙂 Swati