Support Questions
Find answers, ask questions, and share your expertise

Vertex did not succeed due to OWN_TASK_FAILURE error from Hive while using HBase Ser-De

Contributor

Hi,

We are using org.apache.hadoop.hive.hbase.HBaseStorageHandler to insert data into HBase through a Hive external table for a PoC, but every time we run it we hit the error below. We need your help in resolving this issue.

Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:1 killedTasks:50, Vertex vertex_1484566407737_0004_1_00 [Map 1] killed/failed due to:OWN_TASK_FAILURE]DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0 (state=08S01,code=2)

Hadoop distribution : HDP 2.4

File Size : 2GB

Scripts attached: script.txt
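For context, a minimal sketch of the kind of HBase-backed Hive table this thread is about, since the attached script is not shown inline. All table, column, and column-family names here are hypothetical placeholders, not the poster's actual schema:

```sql
-- Hypothetical sketch of an HBase-backed Hive external table.
-- The real DDL is in the attached script.txt.
CREATE EXTERNAL TABLE hbase_target (
  rowkey STRING,
  col1   STRING,
  col2   STRING
)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES (
  -- ":key" maps the first Hive column to the HBase row key;
  -- the rest map to columnfamily:qualifier pairs
  'hbase.columns.mapping' = ':key,cf:col1,cf:col2'
)
TBLPROPERTIES ('hbase.table.name' = 'target_table');

-- Data is then written from a staging table, which is the step
-- that launches the failing Tez vertex:
INSERT INTO TABLE hbase_target
SELECT rowkey, col1, col2 FROM staging_table;
```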

6 REPLIES

Contributor

This PoC is part of a CDC strategy for HBase. We also have two other approaches, using importtsv and generating HFiles directly, but we are not sure why we are facing the issue above.

Really looking for your help to address this issue.

Look at the logs from the YARN container which corresponds to that Hive vertex. You can find these logs via the YARN ResourceManager web UI.
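If the web UI is awkward to navigate, the same logs can be pulled from the command line. A sketch, assuming log aggregation is enabled; the application id below is inferred from the vertex name vertex_1484566407737_0004_1_00 in the error message:

```shell
# Fetch the aggregated YARN logs for the failing application
# (application id inferred from the vertex name in the error).
yarn logs -applicationId application_1484566407737_0004 > app_logs.txt

# Look for the first real error across all containers, including stderr:
grep -n -i -E 'ERROR|Exception|Caused by' app_logs.txt | head -40
```

This is a CLI fragment that needs a live cluster; adjust the application id to whatever your failed run reports.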

Contributor

We have tried the same file and the same strategy on a small dev cluster (2 nodes) running HDP 2.5.3, and there it completed successfully. Is this an issue with the HDP version? Please, we need your urgent help here.

Also, we can't find much in the YARN logs; there are only INFO messages.

Really looking for help and advice.

Contributor

Hi Guys -- looking for your reply. Can you please guide me?

Expert Contributor

Ideally, just before that OWN_TASK_FAILURE log line, there should be an exception or error message about some task for the vertex with id 1484566407737_0004_1. That could give more info. Even if there isn't, you will be able to find the task attempt that actually failed, and that task attempt shows which machine and YARN container it ran on. Sometimes the logs don't contain the error because it was written to stderr; in that case, the stderr stream in the container's YARN logs may show the error.

New Contributor

Hi rajdip,

I am having the same issue when loading data into HBase. Did your error get resolved, or are you still waiting for a solution? Also, do you have any idea how to map HBase columns against a composite (multi-part) row key?
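On the row-key question: HBase has a single row key per table, so hbase.columns.mapping accepts exactly one :key entry. A composite key is typically built by concatenating the key fields in the Hive query before the insert. A hypothetical sketch (column and table names are illustrative, not from this thread):

```sql
-- hbase.columns.mapping allows only one ":key" column, so a
-- multi-part key is composed in the SELECT with a delimiter:
INSERT INTO TABLE hbase_target
SELECT concat(region, '|', customer_id, '|', event_ts) AS rowkey,
       col1,
       col2
FROM staging_table;
```

Pick a delimiter that cannot appear inside the key fields, and order the parts by how you intend to scan (HBase rows sort lexicographically by key).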
