Support Questions


INSERT OVERWRITE from HBase External Table into Hive Managed Table.

Contributor

Hello!

I'm trying to insert into a managed table from an external table backed by HBase.

insert overwrite table managed_ml select key, cf1_id , cf1_name from c_0external_ml;

My CDP cluster has 3 nodes, so my question is: do I need to copy hbase-site.xml into /etc/hive/conf so that Hive can connect to HBase and populate my Hive table?

Currently the insert is failing, and the log shows that the statement is trying to contact a different node instead of the one where the tables are located.

Thanks in advance!!

 


Master Collaborator

Hi @Marks_08 

To insert into a managed table from an external table backed by HBase in CDP, you need to ensure that Hive can properly connect to HBase. A common cause of failure is that the necessary configuration files, such as hbase-site.xml, are not accessible to Hive, which leads to connection issues.
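For context, an HBase-backed external table in Hive is typically declared with the HBase storage handler, along the lines of the sketch below. The column mapping and the HBase table name here are illustrative guesses based on the query in this thread, not the poster's actual DDL:

```sql
-- Hypothetical DDL for the external table used in this thread.
-- 'hbase.columns.mapping' maps the HBase row key (:key) and the
-- column-family qualifiers (cf1:id, cf1:name) to Hive columns.
CREATE EXTERNAL TABLE c_0external_ml (
  key STRING,
  cf1_id STRING,
  cf1_name STRING
)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf1:id,cf1:name')
TBLPROPERTIES ('hbase.table.name' = 'ml_table');  -- assumed name
```

When a table like this is read or written, Hive has to open an HBase client connection, which is why it needs a valid hbase-site.xml on its classpath.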

Here’s what you can do to address this:

1. Copy hbase-site.xml to Hive Configuration Directory

You need to copy the hbase-site.xml file to the Hive configuration directory. This file contains the necessary configuration for Hive to connect to HBase.

sudo cp /etc/hbase/conf/hbase-site.xml /etc/hive/conf/

2. Verify HBase Configuration

Ensure that the hbase-site.xml file contains the correct configuration and points to the correct HBase nodes. The key configurations to check are:

  • hbase.zookeeper.quorum
  • hbase.zookeeper.property.clientPort

These settings should correctly point to the Zookeeper quorum and client port used by your HBase cluster.
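In hbase-site.xml these entries look something like the following. The hostnames below are taken from the ZooKeeper connect string that later appears in the poster's log and are placeholders for your actual ZooKeeper nodes:

```xml
<!-- Illustrative values; replace with your cluster's ZooKeeper hosts -->
<property>
  <name>hbase.zookeeper.quorum</name>
  <value>host11.com,host12.com,host14.com</value>
</property>
<property>
  <name>hbase.zookeeper.property.clientPort</name>
  <value>2181</value>
</property>
```

If the quorum lists the wrong hosts, Hive will attempt to reach HBase through nodes that do not serve it, which matches the "calling a different node" symptom described above.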

3. Restart Hive Service

After copying the hbase-site.xml file, you might need to restart the Hive service to ensure it picks up the new configuration.

4. Check Hive and HBase Connectivity

Make sure that the Hive service can properly communicate with HBase by running a simple query that accesses HBase data through Hive.
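For example, a quick smoke test from Beeline against the external table from this thread could be (the table and column names are the ones used in the original query):

```sql
-- Read path: should return rows if Hive can reach HBase at all
SELECT key, cf1_id, cf1_name FROM c_0external_ml LIMIT 5;
```

If the SELECT works but the INSERT OVERWRITE fails, the read path is fine and the problem is likely in the Tez tasks that perform the write, so the task logs are the next place to look.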

 

Regards,

Chethan YM

Contributor

Thanks @ChethanYM 

I have completed the 4 steps you suggested and it is still failing.

I also executed:

  • with tmp_table as (select * from external_hbase_table) select * from tmp_table;

And tmp_table shows the results, so the issue is with persisting the data into a managed table.

Any other idea?

Master Collaborator

  • Does the query fail in the compilation stage or the execution stage?
  • Could you please share the complete stack trace of the query failure?

Contributor

Hi @ggangadharan 

Query is failing in execution stage.
Here is the complete stack trace:

Container: container_e56_1723487266861_0004_01_000001 on host11.com:8041
LogAggregationType: LOCAL
=============================================================================================
LogType:syslog_dag_1723487266861_0004_2
LogLastModifiedTime:Mon Aug 12 13:24:12 -0700 2024
LogLength:98347
LogContents:
2024-08-12 13:23:23,676 [INFO] [IPC Server handler 1 on 46167] |app.DAGAppMaster|: Running DAG: insert into table managed_id_name (sele...a) (Stage-1), callerContext={ context=HIVE, callerType=HIVE_QUERY_ID, callerId=hive_20240812132322_49cb35cb-859c-43a3-9bdc-8d069716bf38 }
2024-08-12 13:23:23,698 [INFO] [IPC Server handler 1 on 46167] |HistoryEventHandler.criticalEvents|: [HISTORY][DAG:dag_1723487266861_0004_2][Event:DAG_SUBMITTED]: dagID=dag_1723487266861_0004_2, submitTime=1723494203674, queueName=default
2024-08-12 13:23:23,705 [INFO] [IPC Server handler 1 on 46167] |impl.VertexImpl|: Default container context for vertex_1723487266861_0004_2_00 [Map 1]=LocalResources: [[ name=hbase-shaded-mapreduce-2.2.3.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hbase-shaded-mapreduce-2.2.3.7.1.7.0-551.jar" } size: 43638473 timestamp: 1723487505772 type: FILE visibility: PRIVATE],[ name=hive-exec-3.1.3000.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hive-exec-3.1.3000.7.1.7.0-551.jar" } size: 45604768 timestamp: 1723487315123 type: FILE visibility: PRIVATE],[ name=hive-exec.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hive-exec.jar" } size: 45604768 timestamp: 1723487315601 type: FILE visibility: PRIVATE],[ name=tez-conf.pb, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9/.tez/application_1723487266861_0004/tez-conf.pb" } size: 137728 timestamp: 1723494184859 type: FILE visibility: APPLICATION],[ name=hadoop-mapreduce-client-core-3.1.1.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hadoop-mapreduce-client-core-3.1.1.7.1.7.0-551.jar" } size: 1715959 timestamp: 1723487505901 type: FILE visibility: PRIVATE],[ name=tezlib, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/user/tez/0.9.1.7.1.7.0-551/tez.tar.gz" } size: 293092553 timestamp: 1691791773812 type: ARCHIVE visibility: PUBLIC],[ name=hive-exec-3.1.3000.7.1.7.0-551-ff4237ff03c5fe86dc9ea30fadc41b3b3d73040c9df75cd87a13fa2b20642f4f.jar, value=resource { scheme: 
"hdfs" host: "host14.com" port: 8020 file: "/user/hive/.hiveJars/hive-exec-3.1.3000.7.1.7.0-551-ff4237ff03c5fe86dc9ea30fadc41b3b3d73040c9df75cd87a13fa2b20642f4f.jar" } size: 45604768 timestamp: 1691791898900 type: FILE visibility: PRIVATE],[ name=htrace-core4-4.2.0-incubating.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/htrace-core4-4.2.0-incubating.jar" } size: 1506370 timestamp: 1723487505807 type: FILE visibility: PRIVATE],[ name=hive-hbase-handler-3.1.3000.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hive-hbase-handler-3.1.3000.7.1.7.0-551.jar" } size: 118906 timestamp: 1723487505370 type: FILE visibility: PRIVATE],[ name=hadoop-common-3.1.1.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hadoop-common-3.1.1.7.1.7.0-551.jar" } size: 4241824 timestamp: 1723487505866 type: FILE visibility: PRIVATE],], environment: [[ SHELL=/bin/bash ],[ LD_LIBRARY_PATH=$PWD:$HADOOP_COMMON_HOME/lib/native:$JAVA_LIBRARY_PATH:$PWD:${PARCELS_ROOT}/CDH/lib/hadoop/lib/native:$PWD:$HADOOP_COMMON_HOME/lib/native:$JAVA_LIBRARY_PATH:$PWD:${PARCELS_ROOT}/CDH/lib/hadoop/lib/native:$HADOOP_COMMON_HOME/lib/native/ ],[ HADOOP_ROOT_LOGGER=INFO,CLA ],[ CLASSPATH=$PWD:$PWD/*:$PWD/tezlib/*:$PWD/tezlib/lib/*: ],], credentials(token kinds): [HBASE_AUTH_TOKEN,HDFS_DELEGATION_TOKEN,tez.job,], javaOpts: -server -Djava.net.preferIPv4Stack=true -XX:+PrintGCDetails -verbose:gc -XX:+UseNUMA -XX:+UseG1GC -XX:+ResizeTLAB -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp -server -Djava.net.preferIPv4Stack=true -XX:NewRatio=8 -XX:+UseNUMA -XX:+UseG1GC -XX:+ResizeTLAB -XX:+PrintGCDetails -verbose:gc -Dlog4j.configuratorClass=org.apache.tez.common.TezLog4jConfigurator 
-Dlog4j.configuration=tez-container-log4j.properties -Dyarn.app.container.log.dir=<LOG_DIR> -Dtez.root.logger=INFO,CLA , vertex: vertex_1723487266861_0004_2_00 [Map 1], Default Resources=<memory:4096, vCores:1>
2024-08-12 13:23:23,706 [INFO] [IPC Server handler 1 on 46167] |impl.VertexImpl|: Setting 1 additional inputs for vertexvertex_1723487266861_0004_2_00 [Map 1]
2024-08-12 13:23:23,706 [INFO] [IPC Server handler 1 on 46167] |impl.VertexImpl|: Using ExecutionContext from Vertex for Vertex Map 1
2024-08-12 13:23:23,706 [INFO] [IPC Server handler 1 on 46167] |impl.VertexImpl|: Running vertex: vertex_1723487266861_0004_2_00 [Map 1] : TaskScheduler=0:TezYarn, ContainerLauncher=0:TezYarn, TaskCommunicator=0:TezYarn
2024-08-12 13:23:23,708 [INFO] [IPC Server handler 1 on 46167] |impl.VertexImpl|: Default container context for vertex_1723487266861_0004_2_01 [Reducer 2]=LocalResources: [[ name=hbase-shaded-mapreduce-2.2.3.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hbase-shaded-mapreduce-2.2.3.7.1.7.0-551.jar" } size: 43638473 timestamp: 1723487505772 type: FILE visibility: PRIVATE],[ name=hive-exec-3.1.3000.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hive-exec-3.1.3000.7.1.7.0-551.jar" } size: 45604768 timestamp: 1723487315123 type: FILE visibility: PRIVATE],[ name=hive-exec.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hive-exec.jar" } size: 45604768 timestamp: 1723487315601 type: FILE visibility: PRIVATE],[ name=tez-conf.pb, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9/.tez/application_1723487266861_0004/tez-conf.pb" } size: 137728 timestamp: 1723494184859 type: FILE visibility: APPLICATION],[ name=hadoop-mapreduce-client-core-3.1.1.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hadoop-mapreduce-client-core-3.1.1.7.1.7.0-551.jar" } size: 1715959 timestamp: 1723487505901 type: FILE visibility: PRIVATE],[ name=tezlib, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/user/tez/0.9.1.7.1.7.0-551/tez.tar.gz" } size: 293092553 timestamp: 1691791773812 type: ARCHIVE visibility: PUBLIC],[ name=hive-exec-3.1.3000.7.1.7.0-551-ff4237ff03c5fe86dc9ea30fadc41b3b3d73040c9df75cd87a13fa2b20642f4f.jar, value=resource { 
scheme: "hdfs" host: "host14.com" port: 8020 file: "/user/hive/.hiveJars/hive-exec-3.1.3000.7.1.7.0-551-ff4237ff03c5fe86dc9ea30fadc41b3b3d73040c9df75cd87a13fa2b20642f4f.jar" } size: 45604768 timestamp: 1691791898900 type: FILE visibility: PRIVATE],[ name=htrace-core4-4.2.0-incubating.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/htrace-core4-4.2.0-incubating.jar" } size: 1506370 timestamp: 1723487505807 type: FILE visibility: PRIVATE],[ name=hive-hbase-handler-3.1.3000.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hive-hbase-handler-3.1.3000.7.1.7.0-551.jar" } size: 118906 timestamp: 1723487505370 type: FILE visibility: PRIVATE],[ name=hadoop-common-3.1.1.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hadoop-common-3.1.1.7.1.7.0-551.jar" } size: 4241824 timestamp: 1723487505866 type: FILE visibility: PRIVATE],], environment: [[ SHELL=/bin/bash ],[ LD_LIBRARY_PATH=$PWD:$HADOOP_COMMON_HOME/lib/native:$JAVA_LIBRARY_PATH:$PWD:${PARCELS_ROOT}/CDH/lib/hadoop/lib/native:$PWD:$HADOOP_COMMON_HOME/lib/native:$JAVA_LIBRARY_PATH:$PWD:${PARCELS_ROOT}/CDH/lib/hadoop/lib/native:$HADOOP_COMMON_HOME/lib/native/ ],[ HADOOP_ROOT_LOGGER=INFO,CLA ],[ CLASSPATH=$PWD:$PWD/*:$PWD/tezlib/*:$PWD/tezlib/lib/*: ],], credentials(token kinds): [HBASE_AUTH_TOKEN,HDFS_DELEGATION_TOKEN,tez.job,], javaOpts: -server -Djava.net.preferIPv4Stack=true -XX:+PrintGCDetails -verbose:gc -XX:+UseNUMA -XX:+UseG1GC -XX:+ResizeTLAB -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp -server -Djava.net.preferIPv4Stack=true -XX:NewRatio=8 -XX:+UseNUMA -XX:+UseG1GC -XX:+ResizeTLAB -XX:+PrintGCDetails -verbose:gc 
-Dlog4j.configuratorClass=org.apache.tez.common.TezLog4jConfigurator -Dlog4j.configuration=tez-container-log4j.properties -Dyarn.app.container.log.dir=<LOG_DIR> -Dtez.root.logger=INFO,CLA , vertex: vertex_1723487266861_0004_2_01 [Reducer 2], Default Resources=<memory:4096, vCores:1>
2024-08-12 13:23:23,708 [INFO] [IPC Server handler 1 on 46167] |impl.VertexImpl|: Setting 1 additional outputs for vertex vertex_1723487266861_0004_2_01 [Reducer 2]
2024-08-12 13:23:23,708 [INFO] [IPC Server handler 1 on 46167] |impl.VertexImpl|: Using ExecutionContext from Vertex for Vertex Reducer 2
2024-08-12 13:23:23,709 [INFO] [IPC Server handler 1 on 46167] |impl.VertexImpl|: Running vertex: vertex_1723487266861_0004_2_01 [Reducer 2] : TaskScheduler=0:TezYarn, ContainerLauncher=0:TezYarn, TaskCommunicator=0:TezYarn
2024-08-12 13:23:23,713 [INFO] [IPC Server handler 1 on 46167] |impl.DAGImpl|: Using DAG Scheduler: org.apache.tez.dag.app.dag.impl.DAGSchedulerNaturalOrder
2024-08-12 13:23:23,713 [INFO] [IPC Server handler 1 on 46167] |tez.Utils|: Generating DAG graphviz file, dagId=dag_1723487266861_0004_2, filePath=/yarn/container-logs/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/dag_1723487266861_0004_2_priority.dot
2024-08-12 13:23:23,714 [INFO] [IPC Server handler 1 on 46167] |HistoryEventHandler.criticalEvents|: [HISTORY][DAG:dag_1723487266861_0004_2][Event:DAG_INITIALIZED]: dagID=dag_1723487266861_0004_2, initTime=1723494203714
2024-08-12 13:23:23,714 [INFO] [IPC Server handler 1 on 46167] |impl.DAGImpl|: dag_1723487266861_0004_2 transitioned from NEW to INITED due to event DAG_INIT
2024-08-12 13:23:23,714 [INFO] [Dispatcher thread {Central}] |impl.DAGImpl|: Added additional resources : [[]] to classpath
2024-08-12 13:23:23,715 [INFO] [Dispatcher thread {Central}] |HistoryEventHandler.criticalEvents|: [HISTORY][DAG:dag_1723487266861_0004_2][Event:DAG_STARTED]: dagID=dag_1723487266861_0004_2, startTime=1723494203714
2024-08-12 13:23:23,715 [INFO] [Dispatcher thread {Central}] |impl.DAGImpl|: dag_1723487266861_0004_2 transitioned from INITED to RUNNING due to event DAG_START
2024-08-12 13:23:23,715 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Root Inputs exist for Vertex: Map 1 : {a={InputName=a}, {Descriptor=ClassName=org.apache.tez.mapreduce.input.MRInputLegacy, hasPayload=true}, {ControllerDescriptor=ClassName=org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator, hasPayload=false}}
2024-08-12 13:23:23,715 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Starting root input initializer for input: a, with class: [org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator]
2024-08-12 13:23:23,715 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Setting vertexManager to RootInputVertexManager for vertex_1723487266861_0004_2_00 [Map 1]
2024-08-12 13:23:23,729 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Num tasks is -1. Expecting VertexManager/InputInitializers/1-1 split to set #tasks for the vertex vertex_1723487266861_0004_2_00 [Map 1]
2024-08-12 13:23:23,729 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Vertex will initialize from input initializer. vertex_1723487266861_0004_2_00 [Map 1]
2024-08-12 13:23:23,729 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Starting 1 inputInitializers for vertex vertex_1723487266861_0004_2_00 [Map 1]
2024-08-12 13:23:23,741 [INFO] [App Shared Pool - #1] |exec.Utilities|: PLAN PATH = hdfs://host14.com:8020/tmp/hive/hive/8a99b093-53f4-455b-8224-5f6355e960ed/hive_2024-08-12_13-23-22_924_2556685397081991603-9/hive/_tez_scratch_dir/2ea04305-4390-4a81-a6c2-8376f90dda7f/map.xml
2024-08-12 13:23:23,742 [INFO] [App Shared Pool - #1] |exec.SerializationUtilities|: Deserializing MapWork using kryo
2024-08-12 13:23:23,765 [INFO] [App Shared Pool - #1] |exec.Utilities|: Deserialized plan (via RPC) - name: Map 1 size: 8.28KB
2024-08-12 13:23:23,765 [INFO] [App Shared Pool - #1] |tez.HiveSplitGenerator|: SplitGenerator using llap affinitized locations: false
2024-08-12 13:23:23,766 [INFO] [App Shared Pool - #1] |tez.HiveSplitGenerator|: SplitLocationProvider: org.apache.hadoop.hive.ql.exec.tez.Utils$1@7aebc17b
2024-08-12 13:23:23,767 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: vertex_1723487266861_0004_2_00 [Map 1] transitioned from NEW to INITIALIZING due to event V_INIT
2024-08-12 13:23:23,767 [INFO] [App Shared Pool - #0] |dag.RootInputInitializerManager|: Starting InputInitializer for Input: a on vertex vertex_1723487266861_0004_2_00 [Map 1]
2024-08-12 13:23:23,768 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Setting vertexManager to ShuffleVertexManager for vertex_1723487266861_0004_2_01 [Reducer 2]
2024-08-12 13:23:23,768 [INFO] [App Shared Pool - #0] |tez.HiveSplitGenerator|: GenerateConsistentSplitsInHive=true
2024-08-12 13:23:23,769 [INFO] [App Shared Pool - #0] |tez.HiveSplitGenerator|: The preferred split size is 16777216
2024-08-12 13:23:23,770 [INFO] [App Shared Pool - #0] |exec.Utilities|: PLAN PATH = hdfs://host14.com:8020/tmp/hive/hive/8a99b093-53f4-455b-8224-5f6355e960ed/hive_2024-08-12_13-23-22_924_2556685397081991603-9/hive/_tez_scratch_dir/2ea04305-4390-4a81-a6c2-8376f90dda7f/map.xml
2024-08-12 13:23:23,770 [INFO] [App Shared Pool - #0] |exec.Utilities|: Processing alias a
2024-08-12 13:23:23,770 [INFO] [App Shared Pool - #0] |exec.Utilities|: Adding 1 inputs; the first input is hdfs://host14.com:8020/warehouse/tablespace/external/hive/c_0EXTERNAL_ML_SERDE
2024-08-12 13:23:23,771 [INFO] [App Shared Pool - #0] |io.HiveInputFormat|: hive.io.file.readcolumn.ids = 0,1,2
2024-08-12 13:23:23,772 [INFO] [App Shared Pool - #0] |io.HiveInputFormat|: hive.io.file.readcolumn.names = key,cf1_id,cf1_name
2024-08-12 13:23:23,772 [INFO] [App Shared Pool - #0] |io.HiveInputFormat|: Generating splits for dirs: hdfs://host14.com:8020/warehouse/tablespace/external/hive/c_0EXTERNAL_ML_SERDE
2024-08-12 13:23:23,782 [INFO] [Dispatcher thread {Central}] |vertexmanager.ShuffleVertexManagerBase|: Settings minFrac: 0.2 maxFrac: 0.4 auto: false desiredTaskIput: 104857600
2024-08-12 13:23:23,782 [INFO] [Dispatcher thread {Central}] |vertexmanager.ShuffleVertexManager|: minTaskParallelism 1
2024-08-12 13:23:23,782 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Creating 1 tasks for vertex: vertex_1723487266861_0004_2_01 [Reducer 2]
2024-08-12 13:23:23,782 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Directly initializing vertex: vertex_1723487266861_0004_2_01 [Reducer 2]
2024-08-12 13:23:23,782 [INFO] [Dispatcher thread {Central}] |HistoryEventHandler.criticalEvents|: [HISTORY][DAG:dag_1723487266861_0004_2][Event:VERTEX_CONFIGURE_DONE]: vertexId=vertex_1723487266861_0004_2_01, reconfigureDoneTime=1723494203782, numTasks=1, vertexLocationHint=null, edgeManagersCount=1, rootInputSpecUpdateCount=0, setParallelismCalledFlag=false
2024-08-12 13:23:23,782 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Setting up committers for vertex vertex_1723487266861_0004_2_01 [Reducer 2], numAdditionalOutputs=1
2024-08-12 13:23:23,783 [INFO] [Dispatcher thread {Central}] |HistoryEventHandler.criticalEvents|: [HISTORY][DAG:dag_1723487266861_0004_2][Event:VERTEX_INITIALIZED]: vertexName=Reducer 2, vertexId=vertex_1723487266861_0004_2_01, initRequestedTime=1723494203767, initedTime=1723494203782, numTasks=1, processorName=org.apache.hadoop.hive.ql.exec.tez.ReduceTezProcessor, additionalInputsCount=0, initGeneratedEventsCount=0, servicePluginInfo=ServicePluginInfo {containerLauncherName=TezYarn, taskSchedulerName=TezYarn, taskCommunicatorName=TezYarn, containerLauncherClassName=org.apache.tez.dag.app.launcher.TezContainerLauncherImpl, taskSchedulerClassName=org.apache.tez.dag.app.rm.YarnTaskSchedulerService, taskCommunicatorClassName=org.apache.tez.dag.app.TezTaskCommunicatorImpl }
2024-08-12 13:23:23,783 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: vertex_1723487266861_0004_2_01 [Reducer 2] transitioned from NEW to INITED due to event V_INIT
2024-08-12 13:23:23,891 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:zookeeper.version=3.5.5-551-0bb5994437a9a16c597ace352f57516f2582d0f9, built on 08/03/2021 19:51 GMT
2024-08-12 13:23:23,891 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:host.name=host11.com
2024-08-12 13:23:23,891 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:java.version=1.8.0_231
2024-08-12 13:23:23,891 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:java.vendor=client Corporation
2024-08-12 13:23:23,891 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:java.home=/usr/java/jdk1.8.0_231/jre
2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:java.class.path=/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/hadoop-common-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/hive-exec.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/hadoop-mapreduce-client-core-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/htrace-core4-4.2.0-incubating.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/hive-exec-3.1.3000.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/hive-exec-3.1.3000.7.1.7.0-551-ff4237ff03c5fe86dc9ea30fadc41b3b3d73040c9df75cd87a13fa2b20642f4f.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/hbase-shaded-mapreduce-2.2.3.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/hive-hbase-handler-3.1.3000.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/hadoop-shim-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-api-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-common-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/containe
r_e56_1723487266861_0004_01_000001/tezlib/tez-runtime-internals-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-runtime-library-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-mapreduce-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-examples-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-dag-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-tests-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-ext-service-tests-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-protobuf-history-plugin-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-yarn-timeline-history-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-yarn-timeline-history-with-acls-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-yarn-timeline-cache-plugin-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-yarn-timeline-history-with-fs-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-history-parser-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hi
ve/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-job-analyzer-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-javadoc-tools-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/hadoop-shim-2.8-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/slf4j-api-1.7.30.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-yarn-api-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-lang-2.6.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/guava-28.1-jre.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/failureaccess-1.0.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jsr305-3.0.0.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/checker-qual-2.8.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/j2objc-annotations-1.3.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/animal-sniffer-annotations-1.18.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0
004_01_000001/tezlib/lib/commons-logging-1.1.3.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jaxb-api-2.2.11.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/protobuf-java-2.5.0.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jackson-annotations-2.10.5.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-common-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-annotations-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-cli-1.2.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-math3-3.1.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/httpclient-4.5.13.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/httpcore-4.4.13.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-codec-1.4.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-io-2.4.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-net-3.6.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-collections-3.2.2.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_17234872
66861_0004_01_000001/tezlib/lib/javax.servlet-api-3.1.0.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/javax.activation-api-1.2.0.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jetty-server-9.4.39.v20210325.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jetty-http-9.4.39.v20210325.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jetty-util-9.4.39.v20210325.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jetty-io-9.4.39.v20210325.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jetty-servlet-9.4.39.v20210325.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jetty-security-9.4.39.v20210325.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jetty-util-ajax-9.4.39.v20210325.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jetty-webapp-9.4.39.v20210325.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jetty-xml-9.4.39.v20210325.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jersey-json-1.19.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jettison-1.3.4.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jaxb-impl-2.2.3-1.jar:/yarn/nm/use
[... Client environment:java.class.path — long listing of /yarn/nm/usercache/hive/.../tezlib/lib jars omitted for readability ...]
2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:java.library.path=/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001:/CDH/lib/hadoop/lib/native:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001:/CDH/lib/hadoop/lib/native:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/lib/native/:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:java.io.tmpdir=/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tmp
2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:java.compiler=<NA>
2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:os.name=Linux
2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:os.arch=amd64
2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:os.version=5.4.17-2136.321.4.1.el8uek.x86_64
2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:user.name=hive
2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:user.home=/var/lib/hive
2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:user.dir=/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001
2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:os.memory.free=1417MB
2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:os.memory.max=1638MB
2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:os.memory.total=1638MB
2024-08-12 13:23:23,895 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Initiating client connection, connectString=host14.com:2181,host11.com:2181,host12.com:2181 sessionTimeout=90000 watcher=org.apache.hadoop.hbase.zookeeper.ReadOnlyZKClient$$Lambda$95/1871934544@473d881c
2024-08-12 13:23:23,898 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |common.X509Util|: Setting -D jdk.tls.rejectClientInitiatedRenegotiation=true to disable client-initiated TLS renegotiation
2024-08-12 13:23:23,901 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ClientCnxnSocket|: jute.maxbuffer value is 4194304 Bytes
2024-08-12 13:23:23,907 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ClientCnxn|: zookeeper.request.timeout value is 0. feature enabled=
2024-08-12 13:23:23,915 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c-SendThread(host14.com:2181)] |zookeeper.ClientCnxn|: Opening socket connection to server host14.com/10.243.11.186:2181. Will not attempt to authenticate using SASL (unknown error)
2024-08-12 13:23:23,915 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c-SendThread(host14.com:2181)] |zookeeper.ClientCnxn|: Socket connection established, initiating session, client: /10.243.11.183:52366, server: host14.com/10.243.11.186:2181
2024-08-12 13:23:23,920 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c-SendThread(host14.com:2181)] |zookeeper.ClientCnxn|: Session establishment complete on server host14.com/10.243.11.186:2181, sessionid = 0x10762717081009c, negotiated timeout = 60000
2024-08-12 13:23:24,102 [INFO] [App Shared Pool - #0] |mapreduce.RegionSizeCalculator|: Calculating region sizes for table "external_ml_serde".
2024-08-12 13:23:28,718 [INFO] [hconnection-0x7093268-shared-pool3-t1] |client.RpcRetryingCallerImpl|: Call exception, tries=6, retries=16, started=4467 ms ago, cancelled=false, msg=Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed, details=row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1, see https://s.apache.org/timeout
2024-08-12 13:23:32,738 [INFO] [hconnection-0x7093268-shared-pool3-t1] |client.RpcRetryingCallerImpl|: Call exception, tries=7, retries=16, started=8487 ms ago, cancelled=false, msg=Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed, details=row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1, see https://s.apache.org/timeout
2024-08-12 13:23:33,545 [INFO] [DelayedContainerManager] |rm.YarnTaskSchedulerService|: No taskRequests. Container's idle timeout delay expired or is new. Releasing container, containerId=container_e56_1723487266861_0004_01_000002, containerExpiryTime=1723494213374, idleTimeout=10000, taskRequestsCount=0, heldContainers=1, delayedContainers=0, isNew=false
2024-08-12 13:23:33,548 [INFO] [Dispatcher thread {Central}] |HistoryEventHandler.criticalEvents|: [HISTORY][DAG:dag_1723487266861_0004_2][Event:CONTAINER_STOPPED]: containerId=container_e56_1723487266861_0004_01_000002, stoppedTime=1723494213548, exitStatus=0
2024-08-12 13:23:33,549 [INFO] [ContainerLauncher #1] |launcher.TezContainerLauncherImpl|: Stopping container_e56_1723487266861_0004_01_000002
2024-08-12 13:23:33,792 [INFO] [Dispatcher thread {Central}] |container.AMContainerImpl|: Container container_e56_1723487266861_0004_01_000002 exited with diagnostics set to Container failed, exitCode=-105. [2024-08-12 13:23:33.560]Container killed by the ApplicationMaster.
[2024-08-12 13:23:33.574]Container killed on request. Exit code is 143
[2024-08-12 13:23:33.574]Container exited with a non-zero exit code 143.

2024-08-12 13:23:34,545 [INFO] [AMRM Callback Handler Thread] |rm.YarnTaskSchedulerService|: Allocated: <memory:0, vCores:0> Free: <memory:664576, vCores:263> pendingRequests: 0 delayedContainers: 0 heartbeats: 101 lastPreemptionHeartbeat: 100
2024-08-12 13:23:42,795 [INFO] [hconnection-0x7093268-shared-pool3-t1] |client.RpcRetryingCallerImpl|: Call exception, tries=8, retries=16, started=18544 ms ago, cancelled=false, msg=Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed, details=row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1, see https://s.apache.org/timeout
2024-08-12 13:23:47,094 [INFO] [AMRM Callback Handler Thread] |rm.YarnTaskSchedulerService|: Allocated: <memory:0, vCores:0> Free: <memory:664576, vCores:263> pendingRequests: 0 delayedContainers: 0 heartbeats: 151 lastPreemptionHeartbeat: 150
2024-08-12 13:23:52,848 [INFO] [hconnection-0x7093268-shared-pool3-t1] |client.RpcRetryingCallerImpl|: Call exception, tries=9, retries=16, started=28597 ms ago, cancelled=false, msg=Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed, details=row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1, see https://s.apache.org/timeout
2024-08-12 13:23:59,644 [INFO] [AMRM Callback Handler Thread] |rm.YarnTaskSchedulerService|: Allocated: <memory:0, vCores:0> Free: <memory:664576, vCores:263> pendingRequests: 0 delayedContainers: 0 heartbeats: 201 lastPreemptionHeartbeat: 200
2024-08-12 13:24:02,884 [INFO] [hconnection-0x7093268-shared-pool3-t1] |client.RpcRetryingCallerImpl|: Call exception, tries=10, retries=16, started=38633 ms ago, cancelled=false, msg=Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed, details=row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1, see https://s.apache.org/timeout
2024-08-12 13:24:12,195 [INFO] [AMRM Callback Handler Thread] |rm.YarnTaskSchedulerService|: Allocated: <memory:0, vCores:0> Free: <memory:664576, vCores:263> pendingRequests: 0 delayedContainers: 0 heartbeats: 251 lastPreemptionHeartbeat: 250
2024-08-12 13:24:12,895 [INFO] [hconnection-0x7093268-shared-pool3-t1] |client.RpcRetryingCallerImpl|: Call exception, tries=11, retries=16, started=48644 ms ago, cancelled=false, msg=Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed, details=row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1, see https://s.apache.org/timeout
2024-08-12 13:24:12,940 [ERROR] [App Shared Pool - #0] |io.HiveInputFormat|: Failed; user hive tokens Kind: HBASE_AUTH_TOKEN, Service: 73f1f19c-6191-4a6f-81a0-8cfe6a5876b7, Ident: ((username=hive/host14.com@US.client.COM, keyId=361, issueDate=1723494203628, expirationDate=1724099003628, sequenceNumber=12)), Kind: HDFS_DELEGATION_TOKEN, Service: 10.243.11.186:8020, Ident: (token for hive: HDFS_DELEGATION_TOKEN owner=hive/host14.com@US.client.COM, renewer=yarn, realUser=, issueDate=1723494184828, maxDate=1724098984828, sequenceNumber=27767, masterKeyId=357), Kind: tez.job, Service: application_1723487266861_0004, Ident: 1e 61 70 70 6c 69 63 61 74 69 6f 6e 5f 31 37 32 33 34 38 37 32 36 36 38 36 31 5f 30 30 30 34,
2024-08-12 13:24:12,940 [INFO] [App Shared Pool - #0] |dag.RootInputInitializerManager|: Failed InputInitializer for Input: a on vertex vertex_1723487266861_0004_2_00 [Map 1]
2024-08-12 13:24:12,942 [ERROR] [Dispatcher thread {Central}] |impl.VertexImpl|: Vertex Input: a initializer failed, vertex=vertex_1723487266861_0004_2_00 [Map 1]
org.apache.tez.dag.app.dag.impl.AMUserCodeException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=16, exceptions:
2024-08-12T20:24:12.897Z, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68694: Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1

at org.apache.tez.dag.app.dag.RootInputInitializerManager.runInitializerAndProcessResult(RootInputInitializerManager.java:188)
at org.apache.tez.dag.app.dag.RootInputInitializerManager.lambda$createAndStartInitializing$2(RootInputInitializerManager.java:171)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:69)
at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=16, exceptions:
2024-08-12T20:24:12.897Z, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68694: Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1

at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:299)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:251)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:58)
at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithoutRetries(RpcRetryingCallerImpl.java:192)
at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:267)
at org.apache.hadoop.hbase.client.ClientScanner.loadCache(ClientScanner.java:435)
at org.apache.hadoop.hbase.client.ClientScanner.nextWithSyncCache(ClientScanner.java:310)
at org.apache.hadoop.hbase.client.ClientScanner.next(ClientScanner.java:595)
at org.apache.hadoop.hbase.MetaTableAccessor.scanMeta(MetaTableAccessor.java:800)
at org.apache.hadoop.hbase.MetaTableAccessor.scanMeta(MetaTableAccessor.java:768)
at org.apache.hadoop.hbase.MetaTableAccessor.scanMeta(MetaTableAccessor.java:721)
at org.apache.hadoop.hbase.MetaTableAccessor.scanMetaForTableRegions(MetaTableAccessor.java:716)
at org.apache.hadoop.hbase.client.HRegionLocator.listRegionLocations(HRegionLocator.java:114)
at org.apache.hadoop.hbase.client.HRegionLocator.getAllRegionLocations(HRegionLocator.java:78)
at org.apache.hadoop.hbase.mapreduce.RegionSizeCalculator.getRegionServersOfTable(RegionSizeCalculator.java:103)
at org.apache.hadoop.hbase.mapreduce.RegionSizeCalculator.init(RegionSizeCalculator.java:79)
at org.apache.hadoop.hbase.mapreduce.RegionSizeCalculator.<init>(RegionSizeCalculator.java:61)
at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.createRegionSizeCalculator(TableInputFormatBase.java:593)
at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.oneInputSplitPerRegion(TableInputFormatBase.java:294)
at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.getSplits(TableInputFormatBase.java:257)
at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplitsInternal(HiveHBaseTableInputFormat.java:349)
at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.access$200(HiveHBaseTableInputFormat.java:68)
at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat$2.run(HiveHBaseTableInputFormat.java:271)
at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat$2.run(HiveHBaseTableInputFormat.java:269)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1898)
at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplits(HiveHBaseTableInputFormat.java:269)
at org.apache.hadoop.hive.ql.io.HiveInputFormat.addSplitsForGroup(HiveInputFormat.java:542)
at org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:850)
at org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.initialize(HiveSplitGenerator.java:250)
at org.apache.tez.dag.app.dag.RootInputInitializerManager.lambda$runInitializer$3(RootInputInitializerManager.java:203)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1898)
at org.apache.tez.dag.app.dag.RootInputInitializerManager.runInitializer(RootInputInitializerManager.java:196)
at org.apache.tez.dag.app.dag.RootInputInitializerManager.runInitializerAndProcessResult(RootInputInitializerManager.java:177)
... 8 more
Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=68694: Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1
at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:159)
at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:80)
... 3 more
Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed
at org.apache.hadoop.hbase.ipc.IPCUtil.wrapException(IPCUtil.java:206)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.onCallFinished(AbstractRpcClient.java:383)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.access$100(AbstractRpcClient.java:91)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:414)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:410)
at org.apache.hadoop.hbase.ipc.Call.callComplete(Call.java:117)
at org.apache.hadoop.hbase.ipc.Call.setException(Call.java:132)
at org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler.cleanupCalls(NettyRpcDuplexHandler.java:203)
at org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler.channelInactive(NettyRpcDuplexHandler.java:211)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:241)
at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:389)
at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:354)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:241)
at org.apache.hbase.thirdparty.io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:81)
at org.apache.hbase.thirdparty.io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:277)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:241)
at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1405)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)
at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:901)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:819)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)
at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:497)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
at org.apache.hbase.thirdparty.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
... 1 more
Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed
... 26 more
2024-08-12 13:24:12,951 [INFO] [Dispatcher thread {Central}] |HistoryEventHandler.criticalEvents|: [HISTORY][DAG:dag_1723487266861_0004_2][Event:VERTEX_FINISHED]: vertexName=Map 1, vertexId=vertex_1723487266861_0004_2_00, initRequestedTime=1723494203715, initedTime=0, startRequestedTime=1723494203767, startedTime=0, finishTime=1723494252949, timeTaken=1723494252949, status=FAILED, diagnostics=Vertex vertex_1723487266861_0004_2_00 [Map 1] killed/failed due to:ROOT_INPUT_INIT_FAILURE
Vertex Input: a initializer failed, vertex=vertex_1723487266861_0004_2_00 [Map 1], org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=16, exceptions:
2024-08-12T20:24:12.897Z, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68694: Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1

, counters=Counters: 0, vertexStats=firstTaskStartTime=-1, firstTasksToStart=[ ], lastTaskFinishTime=-1, lastTasksToFinish=[ ], minTaskDuration=-1, maxTaskDuration=-1, avgTaskDuration=-1.0, numSuccessfulTasks=0, shortestDurationTasks=[ ], longestDurationTasks=[ ], vertexTaskStats={numFailedTaskAttempts=0, numKilledTaskAttempts=0, numCompletedTasks=0, numSucceededTasks=0, numKilledTasks=0, numFailedTasks=0}, servicePluginInfo=ServicePluginInfo {containerLauncherName=TezYarn, taskSchedulerName=TezYarn, taskCommunicatorName=TezYarn, containerLauncherClassName=org.apache.tez.dag.app.launcher.TezContainerLauncherImpl, taskSchedulerClassName=org.apache.tez.dag.app.rm.YarnTaskSchedulerService, taskCommunicatorClassName=org.apache.tez.dag.app.TezTaskCommunicatorImpl }
2024-08-12 13:24:12,952 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: vertex_1723487266861_0004_2_00 [Map 1] transitioned from INITIALIZING to FAILED due to event V_ROOT_INPUT_FAILED
2024-08-12 13:24:12,953 [INFO] [Dispatcher thread {Central}] |impl.DAGImpl|: Vertex vertex_1723487266861_0004_2_00 [Map 1] completed., numCompletedVertices=1, numSuccessfulVertices=0, numFailedVertices=1, numKilledVertices=0, numVertices=2
2024-08-12 13:24:12,953 [INFO] [Dispatcher thread {Central}] |impl.DAGImpl|: Checking vertices for DAG completion, numCompletedVertices=1, numSuccessfulVertices=0, numFailedVertices=1, numKilledVertices=0, numVertices=2, commitInProgress=0, terminationCause=VERTEX_FAILURE
2024-08-12 13:24:12,953 [INFO] [Dispatcher thread {Central}] |impl.DAGImpl|: dag_1723487266861_0004_2 transitioned from RUNNING to TERMINATING due to event DAG_VERTEX_COMPLETED
2024-08-12 13:24:12,953 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Invoking committer abort for vertex, vertexId=vertex_1723487266861_0004_2_01 [Reducer 2]
2024-08-12 13:24:12,955 [INFO] [Dispatcher thread {Central}] |HistoryEventHandler.criticalEvents|: [HISTORY][DAG:dag_1723487266861_0004_2][Event:VERTEX_FINISHED]: vertexName=Reducer 2, vertexId=vertex_1723487266861_0004_2_01, initRequestedTime=1723494203767, initedTime=1723494203782, startRequestedTime=0, startedTime=0, finishTime=1723494252953, timeTaken=1723494252953, status=KILLED, diagnostics=Vertex received Kill in INITED state.
Vertex vertex_1723487266861_0004_2_01 [Reducer 2] killed/failed due to:OTHER_VERTEX_FAILURE, counters=Counters: 0, vertexStats=firstTaskStartTime=0, firstTasksToStart=[ task_1723487266861_0004_2_01_000000 ], lastTaskFinishTime=-1, lastTasksToFinish=[ ], minTaskDuration=-1, maxTaskDuration=-1, avgTaskDuration=-1.0, numSuccessfulTasks=0, shortestDurationTasks=[ ], longestDurationTasks=[ ], vertexTaskStats={numFailedTaskAttempts=0, numKilledTaskAttempts=0, numCompletedTasks=0, numSucceededTasks=0, numKilledTasks=0, numFailedTasks=0}, servicePluginInfo=ServicePluginInfo {containerLauncherName=TezYarn, taskSchedulerName=TezYarn, taskCommunicatorName=TezYarn, containerLauncherClassName=org.apache.tez.dag.app.launcher.TezContainerLauncherImpl, taskSchedulerClassName=org.apache.tez.dag.app.rm.YarnTaskSchedulerService, taskCommunicatorClassName=org.apache.tez.dag.app.TezTaskCommunicatorImpl }
2024-08-12 13:24:12,955 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: vertex_1723487266861_0004_2_01 [Reducer 2] transitioned from INITED to KILLED due to event V_TERMINATE
2024-08-12 13:24:12,955 [INFO] [Dispatcher thread {Central}] |impl.DAGImpl|: Vertex vertex_1723487266861_0004_2_01 [Reducer 2] completed., numCompletedVertices=2, numSuccessfulVertices=0, numFailedVertices=1, numKilledVertices=1, numVertices=2
2024-08-12 13:24:12,955 [INFO] [Dispatcher thread {Central}] |impl.DAGImpl|: Checking vertices for DAG completion, numCompletedVertices=2, numSuccessfulVertices=0, numFailedVertices=1, numKilledVertices=1, numVertices=2, commitInProgress=0, terminationCause=VERTEX_FAILURE
2024-08-12 13:24:12,955 [INFO] [Dispatcher thread {Central}] |impl.DAGImpl|: DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:1
2024-08-12 13:24:12,959 [INFO] [Dispatcher thread {Central}] |recovery.RecoveryService|: DAG completed, dagId=dag_1723487266861_0004_2, queueSize=0
2024-08-12 13:24:12,969 [INFO] [Dispatcher thread {Central}] |HistoryEventHandler.criticalEvents|: [HISTORY][DAG:dag_1723487266861_0004_2][Event:DAG_FINISHED]: dagId=dag_1723487266861_0004_2, startTime=1723494203714, finishTime=1723494252955, timeTaken=49241, status=FAILED, diagnostics=Vertex failed, vertexName=Map 1, vertexId=vertex_1723487266861_0004_2_00, diagnostics=[Vertex vertex_1723487266861_0004_2_00 [Map 1] killed/failed due to:ROOT_INPUT_INIT_FAILURE, Vertex Input: a initializer failed, vertex=vertex_1723487266861_0004_2_00 [Map 1], org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=16, exceptions:
2024-08-12T20:24:12.897Z, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68694: Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1

]
Vertex killed, vertexName=Reducer 2, vertexId=vertex_1723487266861_0004_2_01, diagnostics=[Vertex received Kill in INITED state., Vertex vertex_1723487266861_0004_2_01 [Reducer 2] killed/failed due to:OTHER_VERTEX_FAILURE]
DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:1, counters=Counters: 2, org.apache.tez.common.counters.DAGCounter, AM_CPU_MILLISECONDS=4150, AM_GC_TIME_MILLIS=55

avatar
Master Collaborator

From the error stack trace, it looks like the problem is a connection-closed exception between the Tez AM application and the HBase server (host12.com).

This connection closure led to a java.net.SocketTimeoutException. In simpler terms, the Tez AM tried to communicate with HBase, but the call timed out after 60 seconds (callTimeout) because no response arrived within that window.

The failure occurred during the initialization of the vertex "vertex_1723487266861_0004_2_00" within the Tez application.
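If the RegionServers are healthy but simply slow to respond, the 60-second client timeout can be raised in the hbase-site.xml that Hive reads. The property names below are standard HBase client settings; the values are illustrative only and should be tuned to your workload:

```xml
<!-- Illustrative values only; tune to your cluster -->
<property>
  <name>hbase.rpc.timeout</name>
  <value>120000</value> <!-- per-RPC timeout in ms; default is 60000 -->
</property>
<property>
  <name>hbase.client.operation.timeout</name>
  <value>180000</value> <!-- overall client operation timeout in ms -->
</property>
```

Raising timeouts only masks the symptom if the real cause is a connectivity or configuration problem, so it is worth checking the points below first.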

Possible reasons:

- The HBase server might be overloaded, experiencing internal issues, or down entirely.
- Hive might have an incorrect HBase server address or port configured.
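To rule out the second point, you can check the quorum settings Hive actually sees and whether those hosts are reachable. A minimal sketch, assuming the standard Hadoop *-site.xml layout (the path and hostnames are illustrative, not taken from this cluster):

```python
import socket
import xml.etree.ElementTree as ET

def read_hbase_props(path):
    """Parse a Hadoop-style *-site.xml into a dict of name -> value."""
    props = {}
    for prop in ET.parse(path).getroot().iter("property"):
        name = prop.findtext("name")
        if name:
            props[name] = prop.findtext("value")
    return props

def check_quorum(props, timeout=5):
    """Try a plain TCP connect to each ZooKeeper host in the quorum."""
    port = int(props.get("hbase.zookeeper.property.clientPort", "2181"))
    results = {}
    for host in props.get("hbase.zookeeper.quorum", "").split(","):
        host = host.strip()
        if not host:
            continue
        try:
            with socket.create_connection((host, port), timeout=timeout):
                results[host] = True
        except OSError:
            results[host] = False
    return results

if __name__ == "__main__":
    # Illustrative path; use the config dir your HiveServer2/Tez tasks read
    props = read_hbase_props("/etc/hive/conf/hbase-site.xml")
    print(check_quorum(props))
```

Running this on the node where the Tez AM was scheduled confirms both that the hbase-site.xml Hive sees points at the expected hosts and that those hosts accept connections from that node.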


avatar
Contributor

Thanks @ggangadharan 

As far as I can see, HBase is up and running, but I found something in the HBase log:

2024-08-13 21:53:30,583 INFO SecurityLogger.org.apache.hadoop.hbase.Server: Auth successful for hive/HOST@REALM (auth:KERBEROS)
2024-08-13 21:53:30,584 INFO SecurityLogger.org.apache.hadoop.hbase.Server: Connection from xx.xxx.xx.xxx:55106, version=2.2.3.7.1.7.0-551, sasl=true, ugi=hive/HOST@REALM (auth:KERBEROS), service=ClientService
2024-08-13 21:53:30,584 INFO SecurityLogger.org.apache.hadoop.security.authorize.ServiceAuthorizationManager: Authorization successful for hive/HOST@REALM (auth:KERBEROS) for protocol=interface org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$BlockingInterface
2024-08-13 21:53:38,853 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39718
2024-08-13 21:53:38,853 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39718
2024-08-13 21:53:39,056 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39720
2024-08-13 21:53:39,056 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39720
2024-08-13 21:53:39,361 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39722
2024-08-13 21:53:39,361 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39722
2024-08-13 21:53:39,869 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39724
2024-08-13 21:53:39,870 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39724
2024-08-13 21:53:40,877 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39726
2024-08-13 21:53:40,877 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39726
2024-08-13 21:53:42,882 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39728
2024-08-13 21:53:42,882 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39728
2024-08-13 21:53:46,219 INFO org.apache.hadoop.hbase.io.hfile.LruBlockCache: totalSize=9.18 MB, freeSize=12.20 GB, max=12.21 GB, blockCount=5, accesses=7481, hits=7461, hitRatio=99.73%, , cachingAccesses=7469, cachingHits=7461, cachingHitsRatio=99.89%, evictions=2009, evicted=0, evictedPerRun=0.0
2024-08-13 21:53:46,914 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39730
2024-08-13 21:53:46,914 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39730
2024-08-13 21:53:50,477 INFO org.apache.hadoop.hbase.ScheduledChore: CompactionThroughputTuner average execution time: 8653 ns.
2024-08-13 21:53:50,572 INFO org.apache.hadoop.hbase.replication.regionserver.Replication: Global stats: WAL Edits Buffer Used=0B, Limit=268435456B

2024-08-13 21:53:55,216 INFO SecurityLogger.org.apache.hadoop.hbase.Server: Auth successful for hbase/HOST@REALM (auth:KERBEROS)
2024-08-13 21:53:55,216 INFO SecurityLogger.org.apache.hadoop.hbase.Server: Connection from xx.xxx.xx.xxx:55174, version=2.2.3.7.1.7.0-551, sasl=true, ugi=hbase/HOST@REALM (auth:KERBEROS), service=ClientService
2024-08-13 21:53:55,216 INFO SecurityLogger.org.apache.hadoop.security.authorize.ServiceAuthorizationManager: Authorization successful for hbase/HOST@REALM (auth:KERBEROS) for protocol=interface org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$BlockingInterface
2024-08-13 21:53:56,136 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping HBase metrics system...
2024-08-13 21:53:56,136 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: HBase metrics system stopped.
2024-08-13 21:53:56,638 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2024-08-13 21:53:56,641 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2024-08-13 21:53:56,641 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: HBase metrics system started


This warning (WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39730) appears only for the statement:

insert overwrite table managed_ml select key, cf1_id , cf1_name from c_0external_ml;

Other statements, such as insert into c_0external_ml values (1,2,3);, run perfectly. 

Does this error sound familiar to you?
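For context on that warning: it means the first bytes the RegionServer read were not the HBase RPC connection preamble (which starts with the magic bytes "HBas"), i.e. something connected to the RPC port but skipped the handshake. That usually points at a client-side protocol or security (SASL/Kerberos) mismatch — for example Tez tasks not picking up the same Kerberized hbase-site.xml as HiveServer2 — rather than HBase being down. A rough sketch of how the two logged headers can be read; the byte interpretations here are my assumption from the standard wire formats, not something stated in the log:

```python
# An HBase RPC connection opens with a preamble whose first 4 bytes are
# the magic b"HBas"; the RegionServer warns when it sees anything else.
HBASE_RPC_MAGIC = b"HBas"

def classify_header(first4: bytes) -> str:
    """Rough guess at what a non-preamble first frame might be."""
    if first4 == HBASE_RPC_MAGIC:
        return "hbase-preamble"
    if first4.startswith(b"\x00"):
        # e.g. b"\x00\x00\x013": looks like a big-endian length prefix,
        # i.e. a framed message sent without the preamble
        return "length-prefixed-frame"
    if first4.startswith(b"\x0a"):
        # e.g. b"\x0a\x04hi": 0x0a is protobuf field 1 (length-delimited),
        # consistent with a bare connection header sent before the preamble
        return "bare-protobuf"
    return "unknown"
```

Both headers in your log classify as frames sent without the preamble, which fits a handshake mismatch on the Hive/Tez client side; verifying that the Tez tasks use the same hbase-site.xml (including the Kerberos settings) as HiveServer2 would be my first check.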