Member since: 12-02-2019
Posts: 19
Kudos Received: 4
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1940 | 04-08-2020 06:13 AM
 | 1789 | 03-06-2020 08:28 AM
07-08-2020
08:27 AM
I am not sure, but I believe you are hitting https://issues.apache.org/jira/browse/HIVE-21799. This fails at the compile phase.
04-14-2020
07:54 AM
1 Kudo
Hello, @Eric_B, thanks for the reply. After some searching, I too have found no permanent solution, but to work around this issue we insert all the data and handle updates with a Hive materialized view. This works fine and fast for my use case. Best Regards
04-08-2020
06:13 AM
1 Kudo
Hi, after some research I found a solution to this issue. The problem came from the Hive table definition used to store the data. I was defining some properties of my table like this:

hive.createTable("res_idos_0")
.ifNotExists()
.prop("serialization.encoding","UTF-8")
.prop("escape.delim" , "\t")
.column("t_date","TIMESTAMP")

But when writing with writeStream and the data contains special characters, the escape.delim property is not supported and the characters cannot be saved correctly. So I removed the escape.delim property from my Hive table definition, and I also added this line to my code to make sure the files saved in HDFS have the right encoding:

System.setProperty("file.encoding", "UTF-8")
02-17-2020
08:21 AM
Hi, @rajkumar_singh, thanks to your advice I found my hiveserver2Interactive.log file, and the error is in tasks 4-5. I noticed two things: first, unfortunately, before the first NPE in hiveserver2Interactive.log there is another NPE; secondly, in task 3 we have "Cannot get a table snapshot for factmachine_mv", but I don't think that is the source of the error.

hiveserver2Interactive.log:

2020-02-17T15:11:13,525 INFO [HiveServer2-Background-Pool: Thread-5750]: FileOperations (FSStatsAggregator.java:aggregateStats(101)) - Read stats for : dwh_dev.factmachine_mv/ numRows 1231370
2020-02-17T15:11:13,525 INFO [HiveServer2-Background-Pool: Thread-5750]: FileOperations (FSStatsAggregator.java:aggregateStats(101)) - Read stats for : dwh_dev.factmachine_mv/ rawDataSize 552885130
2020-02-17T15:11:13,525 WARN [HiveServer2-Background-Pool: Thread-5750]: metadata.Hive (Hive.java:alterTable(778)) - Cannot get a table snapshot for factmachine_mv
2020-02-17T15:11:13,609 INFO [HiveServer2-Background-Pool: Thread-5750]: stats.BasicStatsTask (SessionState.java:printInfo(1227)) - Table dwh_dev.factmachine_mv stats: [numFiles=2, numRows=1231370, totalSize=1483151, rawDataSize=552885130]
2020-02-17T15:11:13,609 INFO [HiveServer2-Background-Pool: Thread-5750]: stats.BasicStatsTask (BasicStatsTask.java:aggregateStats(271)) - Table dwh_dev.factmachine_mv stats: [numFiles=2, numRows=1231370, totalSize=1483151, rawDataSize=552885130]
2020-02-17T15:11:13,620 INFO [HiveServer2-Background-Pool: Thread-5750]: mapred.FileInputFormat (FileInputFormat.java:listStatus(259)) - Total input files to process : 1
2020-02-17T15:11:14,881 INFO [HiveServer2-Background-Pool: Thread-5750]: ql.Driver (Driver.java:launchTask(2710)) - Starting task [Stage-5:DDL] in serial mode
2020-02-17T15:11:14,902 INFO [HiveServer2-Background-Pool: Thread-5750]: parse.CalcitePlanner (CalcitePlanner.java:genLogicalPlan(385)) - Starting generating logical plan
2020-02-17T15:11:14,904 INFO [HiveServer2-Background-Pool: Thread-5750]: parse.CalcitePlanner (SemanticAnalyzer.java:genResolvedParseTree(12232)) - Completed phase 1 of Semantic Analysis
2020-02-17T15:11:14,904 INFO [HiveServer2-Background-Pool: Thread-5750]: parse.CalcitePlanner (SemanticAnalyzer.java:getMetaData(2113)) - Get metadata for source tables
2020-02-17T15:11:14,904 INFO [HiveServer2-Background-Pool: Thread-5750]: parse.CalcitePlanner (SemanticAnalyzer.java:getMetaData(2244)) - Get metadata for subqueries
2020-02-17T15:11:14,905 INFO [HiveServer2-Background-Pool: Thread-5750]: parse.CalcitePlanner (SemanticAnalyzer.java:getMetaData(2113)) - Get metadata for source tables
2020-02-17T15:11:14,935 INFO [HiveServer2-Background-Pool: Thread-5750]: parse.CalcitePlanner (SemanticAnalyzer.java:getMetaData(2244)) - Get metadata for subqueries
2020-02-17T15:11:14,936 INFO [HiveServer2-Background-Pool: Thread-5750]: parse.CalcitePlanner (SemanticAnalyzer.java:getMetaData(2268)) - Get metadata for destination tables
2020-02-17T15:11:14,936 INFO [HiveServer2-Background-Pool: Thread-5750]: parse.CalcitePlanner (SemanticAnalyzer.java:getMetaData(2268)) - Get metadata for destination tables
2020-02-17T15:11:14,944 INFO [HiveServer2-Background-Pool: Thread-5750]: ql.Context (Context.java:getMRScratchDir(551)) - New scratch dir is hdfs://lvdcluster/tmp/hive/hive/c0988fe9-1a5e-4a6a-9ddd-7b782d0cdba1/hive_2020-02-17_15-11-14_897_3869104234201454658-21
2020-02-17T15:11:14,945 INFO [HiveServer2-Background-Pool: Thread-5750]: parse.CalcitePlanner (SemanticAnalyzer.java:genResolvedParseTree(12237)) - Completed getting MetaData in Semantic Analysis
2020-02-17T15:11:14,945 INFO [HiveServer2-Background-Pool: Thread-5750]: parse.BaseSemanticAnalyzer (CalcitePlanner.java:canCBOHandleAst(875)) - Not invoking CBO because the statement has lateral views
2020-02-17T15:11:14,946 ERROR [HiveServer2-Background-Pool: Thread-5750]: exec.TaskRunner (TaskRunner.java:runSequential(108)) - Error in executeTask
java.lang.NullPointerException: null
at org.apache.calcite.plan.RelOptMaterialization.<init>(RelOptMaterialization.java:68) ~[calcite-core-1.16.0.3.1.4.0-315.jar:1.16.0.3.1.4.0-315]
at org.apache.hadoop.hive.ql.metadata.HiveMaterializedViewsRegistry.addMaterializedView(HiveMaterializedViewsRegistry.java:235) ~[hive-exec-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hadoop.hive.ql.metadata.HiveMaterializedViewsRegistry.createMaterializedView(HiveMaterializedViewsRegistry.java:187) ~[hive-exec-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hadoop.hive.ql.exec.MaterializedViewTask.execute(MaterializedViewTask.java:59) ~[hive-exec-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:212) ~[hive-exec-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:103) [hive-exec-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2712) [hive-exec-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2383) [hive-exec-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2055) [hive-exec-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1753) [hive-exec-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1747) [hive-exec-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) [hive-exec-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:226) [hive-service-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hive.service.cli.operation.SQLOperation.access$700(SQLOperation.java:87) [hive-service-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:324) [hive-service-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at java.security.AccessController.doPrivileged(Native Method) [?:1.8.0_112]
at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_112]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) [hadoop-common-3.1.1.3.1.4.0-315.jar:?]
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:342) [hive-service-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_112]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_112]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_112]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_112]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_112]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_112]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
2020-02-17T15:11:14,947 INFO [HiveServer2-Background-Pool: Thread-5750]: reexec.ReOptimizePlugin (ReOptimizePlugin.java:run(70)) - ReOptimization: retryPossible: false
2020-02-17T15:11:14,948 ERROR [HiveServer2-Background-Pool: Thread-5750]: ql.Driver (SessionState.java:printError(1250)) - FAILED: Hive Internal Error: org.apache.hadoop.hive.ql.metadata.HiveException(Error while invoking FailureHook. hooks: java.lang.NullPointerException
at org.apache.hadoop.hive.ql.reexec.ReExecutionOverlayPlugin$LocalHook.run(ReExecutionOverlayPlugin.java:45)
at org.apache.hadoop.hive.ql.HookRunner.invokeGeneralHook(HookRunner.java:296)
at org.apache.hadoop.hive.ql.HookRunner.runFailureHooks(HookRunner.java:283)
at org.apache.hadoop.hive.ql.Driver.invokeFailureHooks(Driver.java:2664)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2434)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2055)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1753)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1747)
at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:226)
at org.apache.hive.service.cli.operation.SQLOperation.access$700(SQLOperation.java:87)
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:324)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:342)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
) For the moment, my solution is to use a simple view, because I noticed that even though the materialized view doesn't work, a simple view works with this query.