Member since: 02-16-2017
Posts: 13
Kudos Received: 0
Solutions: 0
12-28-2018
07:59 AM
Hi @Jalender, @subhash parise, I was able to solve this issue. Based on the log from 'yarn logs -applicationId application_1545806970486_****', the root cause was the following: "java.lang.Exception: java.util.concurrent.ExecutionException: java.lang.VerifyError: Bad return type". The error occurred because the jar I compiled had all the dependency libraries packaged inside it (I am using Maven). Two jar files were generated when I ran "clean install package" in Maven: one contained all the dependency libraries (~77 MB), while the other contained only the class definitions (~100 KB). I had been using the jar with all the dependency libraries on the Hive cluster, and that likely caused the "java.lang.VerifyError: Bad return type" error, as suggested in this post: https://stackoverflow.com/questions/100107/causes-of-getting-a-java-lang-verifyerror. After switching to the second jar with only the class definitions, the INSERT query with the custom UDF worked. Thank you all for the suggestions.
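For anyone who hits the same thing, the fix boils down to pointing Hive at the slim jar instead of the fat one before creating the function. A minimal sketch of that step; the jar path, function name, and class name below are placeholders, not my real ones:
ADD JAR hdfs:///user/hdfs/udfs/my-udf.jar;   -- the ~100 KB jar containing only the UDF class definitions
CREATE TEMPORARY FUNCTION my_udf AS 'com.example.hive.MyUDF';   -- fully qualified class name of the UDF
-- my_udf can now be used inside the INSERT ... SELECT query as before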
12-27-2018
10:42 AM
Hi @kerra, I am also getting a vertex failed error when I try to insert into a table with the 'Tez' engine in Hive. Could you please mention which user permissions need to be set? I am running Hive as the hdfs user. Thank you.
12-27-2018
09:35 AM
Hi, the issue occurs when I use a custom UDF (the error I posted is from one custom UDF; other custom UDFs produce the same vertex failed error) to insert into the Hive table. The UDF itself is fine, as it works when I switch to the 'MR' engine. Also, queries such as "select count(*) from table" work even with the engine set to 'Tez'. Before running the query I tried setting the parameters suggested in other posts:
set hive.execution.engine=tez;
set hive.auto.convert.join=true;
set hive.auto.convert.join.noconditionaltask=true;
set hive.auto.convert.join.noconditionaltask.size=405306368;
set hive.vectorized.execution.enabled=true;
set hive.vectorized.execution.reduce.enabled=true;
set hive.cbo.enable=true;
set hive.compute.query.using.stats=true;
set hive.stats.fetch.column.stats=true;
set hive.stats.fetch.partition.stats=true;
set hive.merge.mapfiles=true;
set hive.merge.mapredfiles=true;
set hive.merge.size.per.task=134217728;
set hive.merge.smallfiles.avgsize=44739242;
set mapreduce.job.reduce.slowstart.completedmaps=0.8;
But it did not work. Is there some other Tez-specific parameter that needs to be tuned for the query to work?
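For reference, a bare SELECT that exercises only the UDF can confirm whether the failure is specific to the UDF-on-Tez combination; a rough sketch, where the UDF and column names are placeholders:
set hive.execution.engine=tez;
select my_udf(id1, id2) from other_table limit 10;   -- expected to fail the same way if the UDF under Tez is the trigger
set hive.execution.engine=mr;
select my_udf(id1, id2) from other_table limit 10;   -- expected to succeed, since the UDF already works under the MR engine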
12-27-2018
05:28 AM
Hi, I checked the log file. The entire log is big, so I copied the error from the log and attached it here. The whole log is full of the same error, as shown in the attached file log-hive.txt.
12-26-2018
11:28 AM
Hi, I am getting an error when I try to run a Hive insert query with a UDF. This is the error I get:
"Vertex failed, vertexName=Map 1, vertexId=vertex_1545806970486_0001_1_00, diagnostics=[Task failed, taskId=task_1545806970486_0001_1_00_000097, diagnostics=[TaskAttempt 0 failed, info=[Container container_e13_1545806970486_0001_01_000107 finished with diagnostics set to [Container completed. ]], TaskAttempt 1 killed, TaskAttempt 2 failed, info=[Container container_e13_1545806970486_0001_01_000136 received a STOP_REQUEST], TaskAttempt 3 failed, info=[Container container_e13_1545806970486_0001_01_000158 finished with diagnostics set to [Container completed. ]], TaskAttempt 4 failed, info=[Container container_e13_1545806970486_0001_01_000254 finished with diagnostics set to [Container completed. ]]], Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:1 killedTasks:230, Vertex vertex_1545806970486_0001_1_00 [Map 1] killed/failed due to:OWN_TASK_FAILURE]
Vertex killed, vertexName=Reducer 2, vertexId=vertex_1545806970486_0001_1_01, diagnostics=[Vertex received Kill while in RUNNING state., Vertex did not succeed due to OTHER_VERTEX_FAILURE, failedTasks:0 killedTasks:418, Vertex vertex_1545806970486_0001_1_01 [Reducer 2] killed/failed due to:OTHER_VERTEX_FAILURE]
DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:1
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Map 1, vertexId=vertex_1545806970486_0001_1_00, diagnostics=[Task failed, taskId=task_1545806970486_0001_1_00_000097, diagnostics=[TaskAttempt 0 failed, info=[Container container_e13_1545806970486_0001_01_000107 finished with diagnostics set to [Container completed. ]], TaskAttempt 1 killed, TaskAttempt 2 failed, info=[Container container_e13_1545806970486_0001_01_000136 received a STOP_REQUEST], TaskAttempt 3 failed, info=[Container container_e13_1545806970486_0001_01_000158 finished with diagnostics set to [Container completed. ]], TaskAttempt 4 failed, info=[Container container_e13_1545806970486_0001_01_000254 finished with diagnostics set to [Container completed. ]]], Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:1 killedTasks:230, Vertex vertex_1545806970486_0001_1_00 [Map 1] killed/failed due to:OWN_TASK_FAILURE]Vertex killed, vertexName=Reducer 2, vertexId=vertex_1545806970486_0001_1_01, diagnostics=[Vertex received Kill while in RUNNING state., Vertex did not succeed due to OTHER_VERTEX_FAILURE, failedTasks:0 killedTasks:418, Vertex vertex_1545806970486_0001_1_01 [Reducer 2] killed/failed due to:OTHER_VERTEX_FAILURE]DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:1"
The error does not say much except that the vertex failed. I have already checked other posts that mention a similar issue, but they did not help:
https://community.hortonworks.com/questions/48549/hive-vertex-issue.html
https://community.hortonworks.com/questions/24730/hive-job-failed-on-tez.html
https://community.hortonworks.com/questions/90648/hive-error-vertex-failed.html
https://community.hortonworks.com/questions/140266/hive-query-error-with-vertex-failed-on-partitioned.html
https://community.hortonworks.com/questions/141485/tez-vertex-failed-due-to-its-own-failuredag-did-no.html
https://community.hortonworks.com/questions/222722/hive-query-fails-in-tez-runs-in-mr-mode.html
I tried to set the configuration mentioned in one of the posts:
set hive.execution.engine=tez;
set hive.auto.convert.join=true;
set hive.auto.convert.join.noconditionaltask=true;
set hive.auto.convert.join.noconditionaltask.size=405306368;
set hive.vectorized.execution.enabled=true;
set hive.vectorized.execution.reduce.enabled=true;
set hive.cbo.enable=true;
set hive.compute.query.using.stats=true;
set hive.stats.fetch.column.stats=true;
set hive.stats.fetch.partition.stats=true;
set hive.merge.mapfiles=true;
set hive.merge.mapredfiles=true;
set hive.merge.size.per.task=134217728;
set hive.merge.smallfiles.avgsize=44739242;
set mapreduce.job.reduce.slowstart.completedmaps=0.8;
But this also did not work. My table structure is as follows:
CREATE TABLE table_name(id string, record ARRAY<ARRAY<string>>)
PARTITIONED BY (dateonly string, place string)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '#'
COLLECTION ITEMS TERMINATED BY ','
MAP KEYS TERMINATED BY '!'
LINES TERMINATED BY '\n'
STORED AS SEQUENCEFILE;
My query structure is as follows:
INSERT OVERWRITE TABLE table_name PARTITION (dateonly='some_date', place='some_place')
select id, UDF(id1,id2.......) as record from other_table where dateonly = 'some_date' and place = 'some_place' group by id;
The thing is, if I change the execution engine to 'mr' it works, but with the 'tez' execution engine this error keeps coming. Please kindly help if anyone has a solution for this issue. Thanks in advance.
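For comparison, the only difference between the working and failing runs is the engine set at the start of the session; a sketch:
set hive.execution.engine=mr;
-- run the same INSERT OVERWRITE ... SELECT query shown above: it completes
set hive.execution.engine=tez;
-- run the same query again: it fails with the "Vertex failed" error shown above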
Labels: Apache Hive
12-12-2018
01:34 AM
Thank you very much for the explanation regarding the workaround to delete the rows.
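For readers who land here later: the general pattern for removing rows from a non-ACID table is to rewrite the data with INSERT OVERWRITE rather than delete in place. A rough sketch of that general technique (not necessarily the exact steps from the explanation above), reusing the placeholder table name and condition from my original question:
-- Rewrite the table, keeping only the rows that should remain.
INSERT OVERWRITE TABLE <table_name>
select * from <table_name> where NOT (<condition>);
-- For a partitioned table, this is normally done per partition with
-- INSERT OVERWRITE TABLE <table_name> PARTITION (...), or with dynamic partitioning enabled.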
12-10-2018
04:30 AM
Hi, the table is stored as TEXTFILE. It is not in ORC format, and bucketing is not enabled. In that case, is there a workaround to delete rows from such a table?
12-07-2018
09:33 AM
I am trying to delete some of the rows from my Hive table, which has partitions. This is what I did:
delete from <table_name> where <condition>;
However, I am getting the following error:
FAILED: SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations.
Can anyone please suggest why the query is not working? Thanks in advance.
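For reference, Hive only accepts DELETE on transactional (ACID) tables, which is what the transaction-manager error points at. A rough sketch of what that setup typically involves on Hive versions of this era; the table name, columns, and bucket count below are placeholders, not my actual table:
set hive.support.concurrency=true;
set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
CREATE TABLE table_name_acid (id string, value string)
CLUSTERED BY (id) INTO 4 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional'='true');
-- DELETE FROM table_name_acid WHERE <condition>; would then be accepted by the transaction manager.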
Labels: Apache Hive