Member since: 08-20-2020

Posts: 4
Kudos Received: 0
Solutions: 0
03-09-2022 04:25 AM

Hello,

Sorry for the late answer. I can't paste all of the information into the chat. Do you have specific values in mind that I can share?
02-15-2022 06:31 AM

Thank you very much for your answer. I'm really stuck on this.

Hive (with LLAP enabled) is installed on this HDP-3.1.0.0 (3.1.0.0-78) cluster.

I also noticed that I cannot access the information_schema view, due to the following error:

    SQL Error [40000] [42000]: Error while compiling statement: FAILED: SemanticException Unable to fetch table dbs. org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error
        at org.apache.hadoop.hdfs.server.namenode.ha.StandbyState.checkOperation(StandbyState.java:88)
        at org.apache.hadoop.hdfs.server.namenode.NameNode$NameNodeHAContext.checkOperation(NameNode.java:1951)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkOperation(FSNamesystem.java:1427)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3100)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1154)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:966)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682)
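For reference, even a minimal read of the information_schema views runs into the error above. The statement below is only a representative sketch (the view and column names follow Hive 3's standard information_schema layout); presumably any simple query against those views fails the same way:

```sql
-- Representative sketch: list a few tables through Hive 3's information_schema views.
-- Any simple read like this appears to fail with the StandbyException shown above.
SELECT table_schema,
       table_name
FROM   information_schema.tables
LIMIT  10;
```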
						
					
02-14-2022 05:06 AM
Hello,

I'm currently facing an issue with Apache Hive, specifically with the surrogate key function.

In order to reproduce the problem, I tried the surrogate key example from the Cloudera documentation:
<https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.4/using-hiveql/content/hive_surrogate_keys.html>

I created the students_v2 table as shown on that page and tried to insert the following row:

`insert into students_v2(row_id, name, dorm) values (1, "ami", "ne")`

However, I receive the following error:

```
SQL Error [2] [08S01]: Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Map 1, vertexId=vertex_1644246846031_0051_1_00, diagnostics=[Task failed, taskId=task_1644246846031_0051_1_00_000000, diagnostics=[TaskAttempt 0 failed, info=[Error: Error while running task ( failure ) : attempt_1644246846031_0051_1_00_000000_0:java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable
	at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:296)
	at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:250)
	at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:374)
	at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:73)
	at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:61)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
	at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:61)
	at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:37)
	at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
	at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:108)
	at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:41)
	at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:77)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable
	at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:101)
	at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.pushRecord(MapRecordSource.java:76)
	at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.run(MapRecordProcessor.java:426)
	at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:267)
	... 16 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable
	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:569)
	at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:92)
	... 19 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Could not obtain Write ID for the surrogate_key function
	at org.apache.hadoop.hive.ql.udf.generic.GenericUDFSurrogateKey.evaluate(GenericUDFSurrogateKey.java:120)
	at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator._evaluate(ExprNodeGenericFuncEvaluator.java:197)
	at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:80)
	at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:68)
	at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:88)
	at org.apache.hadoop.hive.ql.exec.Operator.baseForward(Operator.java:994)
	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:940)
	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:927)
	at org.apache.hadoop.hive.ql.exec.UDTFOperator.forwardUDTFOutput(UDTFOperator.java:133)
	at org.apache.hadoop.hive.ql.udf.generic.UDTFCollector.collect(UDTFCollector.java:45)
	at org.apache.hadoop.hive.ql.udf.generic.GenericUDTF.forward(GenericUDTF.java:110)
	at org.apache.hadoop.hive.ql.udf.generic.GenericUDTFInline.process(GenericUDTFInline.java:64)
	at org.apache.hadoop.hive.ql.exec.UDTFOperator.process(UDTFOperator.java:116)
	at org.apache.hadoop.hive.ql.exec.Operator.baseForward(Operator.java:994)
	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:940)
	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:927)
	at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95)
	at org.apache.hadoop.hive.ql.exec.Operator.baseForward(Operator.java:994)
	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:940)
	at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:125)
	at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:153)
	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:555)
	... 20 more
], TaskAttempt 1 failed, TaskAttempt 2 failed, TaskAttempt 3 failed (each with the same stack trace as TaskAttempt 0)
]], Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:1 killedTasks:0, Vertex vertex_1644246846031_0051_1_00 [Map 1] killed/failed due to:OWN_TASK_FAILURE]
Vertex killed, vertexName=Reducer 2, vertexId=vertex_1644246846031_0051_1_01, diagnostics=[Vertex received Kill while in RUNNING state., Vertex did not succeed due to OTHER_VERTEX_FAILURE, failedTasks:0 killedTasks:1, Vertex vertex_1644246846031_0051_1_01 [Reducer 2] killed/failed due to:OTHER_VERTEX_FAILURE]
DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:1
```
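For context, the students_v2 definition from that documentation page is roughly the following; this is a sketch reproduced from memory, so the exact column list and constraints may differ slightly from the linked page:

```sql
-- Approximate DDL for students_v2 from the Cloudera surrogate-key example
-- (sketch only; see the linked documentation for the authoritative version).
-- As I understand it, surrogate_key() needs a write ID, so the table has to be
-- a transactional (ACID) table; in HDP 3 a managed ORC table is ACID by default.
CREATE TABLE students_v2 (
  `ID`   BIGINT DEFAULT SURROGATE_KEY(),  -- populated by the surrogate_key() UDF on insert
  row_id INT,
  name   VARCHAR(64),
  dorm   INT,
  PRIMARY KEY (`ID`) DISABLE NOVALIDATE
);
```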
						
					
Labels:
- Apache Hive