Member since: 08-01-2017
Posts: 65
Kudos Received: 3
Solutions: 4
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 26676 | 01-22-2018 10:19 AM
 | 3141 | 01-22-2018 10:18 AM
 | 2884 | 07-05-2017 02:33 PM
 | 3322 | 05-26-2017 09:01 AM
08-25-2017 12:12 PM
Hello guys, I'm trying to upgrade from HDP 2.4.0 to HDP 2.6.1. I updated Ambari to 2.5 and everything went OK: all services are green and all service checks run smoothly. When I do the express upgrade to HDP 2.6.1, it fails in the last part. I've followed this article, but when I restart the server it gives this error: Is there any workaround for this? Many thanks in advance. Best regards
Labels:
- Hortonworks Data Platform (HDP)
08-24-2017 12:30 PM
Hey guys, I'm hitting the same problem on HDP-2.6.1.0. I've followed your article, @Sagar Shimpi, but when I restart the server with ambari-server restart, I get this: I attach the ambari-server.log: ambari-server.txt Can you please help? Many thanks in advance. Best regards
08-11-2017 09:04 AM
Hello @Dan Zaratsian I'm currently on HDP 2.6.1.0. I followed the instructions in the link you gave me and used JSON SerDe 1.3.7 with dependencies (I could not find 1.1.4). I created the table and used this JSON captured from Flume: flumedata.zip I ran the query and the same error persists:
java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Map 1, vertexId=vertex_1502379039867_0005_1_00, diagnostics=[Task failed, taskId=task_1502379039867_0005_1_00_000000, diagnostics=[TaskAttempt 0 failed, info=[Error: Failure while running task:java.lang.RuntimeException: java.lang.RuntimeException: Hive Runtime Error while closing operators
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:173)
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:139)
    at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:347)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:194)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:185)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:185)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:181)
    at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: Hive Runtime Error while closing operators
    at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.close(MapRecordProcessor.java:370)
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:164)
    ... 14 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: [Error 20003]: An error occurred when trying to close the Operator running your custom script.
    at org.apache.hadoop.hive.ql.exec.ScriptOperator.close(ScriptOperator.java:560)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:634)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:634)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:634)
    at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.close(MapRecordProcessor.java:346)
    ... 15 more
], TaskAttempt 1 failed, info=[Error: Failure while running task:java.lang.RuntimeException: java.lang.RuntimeException: Hive Runtime Error while closing operators
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:173)
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:139)
    at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:347)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:194)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:185)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:185)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:181)
    at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: Hive Runtime Error while closing operators
    at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.close(MapRecordProcessor.java:370)
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:164)
    ... 14 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: [Error 20003]: An error occurred when trying to close the Operator running your custom script.
    at org.apache.hadoop.hive.ql.exec.ScriptOperator.close(ScriptOperator.java:560)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:634)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:634)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:634)
    at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.close(MapRecordProcessor.java:346)
    ... 15 more
], TaskAttempt 2 failed, info=[Error: Failure while running task:java.lang.RuntimeException: java.lang.RuntimeException: Hive Runtime Error while closing operators
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:173)
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:139)
    at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:347)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:194)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:185)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:185)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:181)
    at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: Hive Runtime Error while closing operators
    at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.close(MapRecordProcessor.java:370)
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:164)
    ... 14 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: [Error 20003]: An error occurred when trying to close the Operator running your custom script.
    at org.apache.hadoop.hive.ql.exec.ScriptOperator.close(ScriptOperator.java:560)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:634)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:634)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:634)
    at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.close(MapRecordProcessor.java:346)
    ... 15 more
], TaskAttempt 3 failed, info=[Error: Failure while running task:java.lang.RuntimeException: java.lang.RuntimeException: Hive Runtime Error while closing operators
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:173)
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:139)
    at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:347)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:194)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:185)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:185)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:181)
    at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: Hive Runtime Error while closing operators
    at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.close(MapRecordProcessor.java:370)
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:164)
    ... 14 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: [Error 20003]: An error occurred when trying to close the Operator running your custom script.
    at org.apache.hadoop.hive.ql.exec.ScriptOperator.close(ScriptOperator.java:560)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:634)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:634)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:634)
    at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.close(MapRecordProcessor.java:346)
    ... 15 more
]], Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:1 killedTasks:0, Vertex vertex_1502379039867_0005_1_00 [Map 1] killed/failed due to:OWN_TASK_FAILURE]DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0
08-04-2017 08:43 AM
Hello @Dan Zaratsian, and many thanks once more. Sorry, you're right, I'd forgotten to select the best answer. I can run queries on the Hive table with this SerDe, except when I query the text column. For instance, if I use your script with the id and lang columns it runs smoothly; if I query the text column, it gives an error. I'll try to update HDP to version 2.6.1.0 and then let you know. Best regards
08-01-2017 09:00 AM
@Dan Zaratsian I don't see how updating HDP can solve my problem. I think the problem may be with the SerDe used when creating the table. I'm currently using this:
ADD JAR hdfs://192.168.0.73:8020/user/admin/oozie-workflows/lib/json-serde-1.3.8-jar-with-dependencies.jar;
CREATE EXTERNAL TABLE tweets (
  id bigint,
  created_at string,
  source STRING,
  favorited BOOLEAN,
  retweeted_status STRUCT<
    text:STRING,
    user:STRUCT<screen_name:STRING,name:STRING>,
    retweet_count:INT>,
  entities STRUCT<
    urls:ARRAY<STRUCT<expanded_url:STRING>>,
    user_mentions:ARRAY<STRUCT<screen_name:STRING,name:STRING>>,
    hashtags:ARRAY<STRUCT<text:STRING>>>,
  lang string,
  retweet_count int,
  text string,
  user STRUCT<
    screen_name:STRING,
    name:STRING,
    friends_count:INT,
    followers_count:INT,
    statuses_count:INT,
    verified:BOOLEAN,
    utc_offset:INT,
    time_zone:STRING>
)
PARTITIONED BY (datehour int)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
WITH SERDEPROPERTIES ("ignore.malformed.json" = "true")
LOCATION 'hdfs://192.168.0.73:8020/user/flume/tweets'
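Since the table definition itself looks plausible, one way to narrow things down is to check whether the raw Flume output is valid line-delimited JSON at all. Below is a minimal, hypothetical Python sketch (the local file name tweets.json is an assumption; it stands in for a file copied out of HDFS with hdfs dfs -get) that reports any lines json.loads cannot parse:

```python
import json
import sys

# Hypothetical helper: scan a locally copied Flume output file (one JSON
# object per line is assumed) and report lines that fail to parse, so that
# malformed records can be ruled in or out before blaming the Hive SerDe.
path = sys.argv[1] if len(sys.argv) > 1 else 'tweets.json'  # assumed local copy

bad = 0
with open(path) as f:
    for lineno, line in enumerate(f, 1):
        line = line.strip()
        if not line:
            continue
        try:
            json.loads(line)
        except ValueError as err:
            bad += 1
            print('line %d is not valid JSON: %s' % (lineno, err))

print('%d malformed line(s) found' % bad)
```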
07-31-2017 04:14 PM
@Dan Zaratsian After some days I've reached the conclusion that the problem must be in the JSON SerDe, because when I load your table into Hive it works OK. I'm currently using json-serde-1.3.8-jar-with-dependencies.jar ... Many thanks in advance. Best regards
07-25-2017 01:31 PM
@Dan Zaratsian Found the error:
createdAt, screenName, text = line.replace('\n',' ').split('\t')
It only works when I have just one variable; with more than one it crashes. Is there any alternative to split('\t')?
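For reference, a minimal sketch of one way to keep split('\t') but avoid the crash when a line does not contain exactly three tab-separated fields (the field names and the skip-on-mismatch behaviour are assumptions, not the original script):

```python
import sys

# Hypothetical sketch: unpack only when the expected number of tab-separated
# fields is present, and skip anything else instead of raising ValueError
# inside the Hive TRANSFORM stream.
for line in sys.stdin:
    fields = line.rstrip('\n').split('\t')
    if len(fields) != 3:
        # Malformed row for this schema; skipping keeps the transform alive.
        continue
    created_at, screen_name, text = fields
    print('\t'.join([created_at, screen_name, text]))
```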
07-24-2017 01:39 PM
@Dan Zaratsian
I only got your 2nd example working... that was to check whether the problem was with the syntax or a system misconfiguration. With the function you gave me I cannot query the tweets table; it gives the row error. Can you please check this code one last time?
import sys
for line in sys.stdin:
    id, text = line.replace('\n',' ').split('\t')
    positive = set(["love", "good", "great", "happy", "cool", "best", "awesome", "nice", "helpful", "enjoyed"])
    negative = set(["hate", "bad", "stupid", "terrible", "unhappy"])
    words = text.split()
    word_count = len(words)
    positive_matches = [1 for word in words if word in positive]
    negative_matches = [-1 for word in words if word in negative]
    st = sum(positive_matches) + sum(negative_matches)
    if st > 0:
        print ('\t'.join([text, 'positive', str(word_count)]))
    elif st < 0:
        print ('\t'.join([text, 'negative', str(word_count)]))
    else:
        print ('\t'.join([text, 'neutral', str(word_count)]))
Best regards
07-24-2017 10:50 AM
@Dan Zaratsian
I was able to reproduce the code you gave me, and it runs OK, but the problem with the Twitter table persists. If I add the JSON SerDe in the query I get an error processing a row; if not, it hangs for a long time and returns "map operator run initialized". I think I have to add the SerDe, so the problem is not from there. Here's the code:
import sys
for line in sys.stdin:
    text = line.split('\t')
    positive = set(["love", "good", "great", "happy", "cool", "best", "awesome", "nice", "helpful", "enjoyed"])
    negative = set(["hate", "bad", "stupid", "terrible", "unhappy"])
    words = text.split()
    word_count = len(words)
    positive_matches = [1 for word in words if word in positive]
    negative_matches = [-1 for word in words if word in negative]
    st = sum(positive_matches) + sum(negative_matches)
    if st > 0:
        print ('\t'.join([text, 'positive', str(word_count)]))
    elif st < 0:
        print ('\t'.join([text, 'negative', str(word_count)]))
    else:
        print ('\t'.join([text, 'neutral', str(word_count)]))
I was able to run the tweets table with this test:
import sys
for line in sys.stdin:
    print ('\t'.join([line]))
ADD JAR /tmp/json-serde-1.3.8-jar-with-dependencies.jar;
ADD FILE /tmp/teste.py;
SELECT
TRANSFORM (text)
USING 'python teste.py'
FROM tweets;
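As a point of comparison, here is a minimal sketch (my own reworking, not the original teste.py) of the sentiment script shaped to match SELECT TRANSFORM (text), which streams a single tab-free column per line, so each input line can be treated as the tweet text itself rather than being unpacked with split('\t'):

```python
import sys

# Hypothetical sketch matched to: SELECT TRANSFORM (text) USING 'python teste.py' FROM tweets;
# Hive streams one column per line here, so the whole line is the tweet text.
positive = set(["love", "good", "great", "happy", "cool", "best", "awesome", "nice", "helpful", "enjoyed"])
negative = set(["hate", "bad", "stupid", "terrible", "unhappy"])

for line in sys.stdin:
    text = line.rstrip('\n')
    words = text.split()
    word_count = len(words)
    score = sum(1 for word in words if word in positive) - sum(1 for word in words if word in negative)
    if score > 0:
        label = 'positive'
    elif score < 0:
        label = 'negative'
    else:
        label = 'neutral'
    # Emit three tab-separated output columns: text, sentiment label, word count.
    print('\t'.join([text, label, str(word_count)]))
```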
07-18-2017 08:24 AM
@Dan Zaratsian I'm not using Spark at the moment, since I'm running the job directly in Hive for troubleshooting. Versions: Spark 1.6.x.2.4, Hive 1.2.1.2.4, HDP 2.4.0.0