<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: INSERT OVERWRITE from HBase External Table into Hive Managed Table. in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/INSERT-OVERWRITE-from-HBase-External-Table-into-Hive-Managed/m-p/391692#M247728</link>
    <description>&lt;P&gt;From the error stack-trace, it looks like the problem is a connection-closed exception between the Tez AM application and the HBase server (host12.com).&lt;BR /&gt;&lt;BR /&gt;This connection closure led to a java.net.SocketTimeoutException. In simpler terms, the Tez AM tried to communicate with HBase, but the call timed out after 60 seconds (callTimeout) because it didn't receive a response within that time frame.&lt;BR /&gt;&lt;BR /&gt;This issue occurred during the initialization of the vertex "vertex_1723487266861_0004_2_00" within the Tez application.&lt;BR /&gt;&lt;BR /&gt;&lt;STRONG&gt;Possible Reasons&lt;/STRONG&gt;&lt;BR /&gt;&lt;BR /&gt;The HBase server might be overloaded, experiencing internal issues, or down entirely.&lt;BR /&gt;Hive might have an incorrect HBase server address or port configured.&lt;/P&gt;</description>
    <pubDate>Tue, 13 Aug 2024 11:37:41 GMT</pubDate>
    <dc:creator>ggangadharan</dc:creator>
    <dc:date>2024-08-13T11:37:41Z</dc:date>
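The timeout diagnosis above reduces to a plain TCP-reachability question: did the Tez AM's connection to the HBase server complete before the deadline? A minimal sketch of that check, with a bounded connect timeout analogous to the 60-second callTimeout; the helper name is invented, and the host/port in the usage comment are assumptions to adapt to your cluster:

```python
# Hypothetical helper (not from the thread): check whether a host:port
# answers a TCP connect within a bounded timeout -- the same failure mode
# as the Tez AM's 60-second callTimeout against the HBase server.
import socket

def port_reachable(host, port, timeout_s=5.0):
    """Return True if a TCP connection to host:port succeeds within timeout_s."""
    try:
        with socket.create_connection((host, port), timeout=timeout_s):
            return True
    except OSError:  # covers refusals, timeouts, and name-resolution failures
        return False

# Example against the thread's host (16020 is the usual RegionServer RPC
# port, an assumption here): port_reachable("host12.com", 16020)
```

If this returns False from the node running the Tez AM, the problem is network/service availability rather than Hive configuration.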
    <item>
      <title>INSERT OVERWRITE from HBase External Table into Hive Managed Table.</title>
      <link>https://community.cloudera.com/t5/Support-Questions/INSERT-OVERWRITE-from-HBase-External-Table-into-Hive-Managed/m-p/391421#M247612</link>
      <description>&lt;P&gt;Hello!&lt;/P&gt;&lt;P&gt;I'm trying to insert into a managed table from an external table created with an HBase structure:&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;insert overwrite table managed_ml select key, cf1_id, cf1_name from c_0external_ml;&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;My CDP cluster contains 3 nodes, so my question is: &lt;STRONG&gt;&lt;EM&gt;do I need to make a copy of hbase-site.xml into /etc/hive/conf so Hive is able to make the connection to HBase&lt;/EM&gt;&lt;/STRONG&gt; in order to populate my Hive table?&lt;/P&gt;&lt;P&gt;Currently the insert is failing, and the log shows that the statement is trying to call a different node instead of the one where the tables are located.&lt;/P&gt;&lt;P&gt;Thanks in advance!!&lt;/P&gt;</description>
      <pubDate>Tue, 06 Aug 2024 16:06:11 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/INSERT-OVERWRITE-from-HBase-External-Table-into-Hive-Managed/m-p/391421#M247612</guid>
      <dc:creator>Marks_08</dc:creator>
      <dc:date>2024-08-06T16:06:11Z</dc:date>
    </item>
    <item>
      <title>Re: INSERT OVERWRITE from HBase External Table into Hive Managed Table.</title>
      <link>https://community.cloudera.com/t5/Support-Questions/INSERT-OVERWRITE-from-HBase-External-Table-into-Hive-Managed/m-p/391431#M247619</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/110271"&gt;@Marks_08&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P class="p1"&gt;To insert into a managed table from an external table created with an HBase structure in CDP, you need to ensure that Hive can properly connect to HBase. This typically means making sure the Hive service is configured to access HBase correctly. One common issue is that the necessary configuration files, such as hbase-site.xml, are not accessible to Hive, leading to connection problems.&lt;/P&gt;&lt;P class="p1"&gt;Here’s what you can do to address this:&lt;/P&gt;&lt;P class="p2"&gt;&lt;STRONG&gt;1. Copy &lt;/STRONG&gt;&lt;SPAN class="s1"&gt;&lt;STRONG&gt;hbase-site.xml&lt;/STRONG&gt;&lt;/SPAN&gt;&lt;STRONG&gt; to the Hive Configuration Directory&lt;/STRONG&gt;&lt;/P&gt;&lt;P class="p1"&gt;Copy the hbase-site.xml file to the Hive configuration directory. This file contains the configuration Hive needs in order to connect to HBase.&lt;/P&gt;&lt;P class="p1"&gt;sudo cp /etc/hbase/conf/hbase-site.xml /etc/hive/conf/&lt;/P&gt;&lt;P class="p2"&gt;&lt;STRONG&gt;2. Verify the HBase Configuration&lt;/STRONG&gt;&lt;/P&gt;&lt;P class="p1"&gt;Ensure that the hbase-site.xml file contains the correct configuration and points to the correct HBase nodes. The key settings to check are:&lt;/P&gt;&lt;UL class="ul1"&gt;&lt;LI&gt;hbase.zookeeper.quorum&lt;/LI&gt;&lt;LI&gt;hbase.zookeeper.property.clientPort&lt;/LI&gt;&lt;/UL&gt;&lt;P class="p1"&gt;These settings should correctly point to the ZooKeeper quorum and client port used by your HBase cluster.&lt;/P&gt;&lt;P class="p2"&gt;&lt;STRONG&gt;3. Restart the Hive Service&lt;/STRONG&gt;&lt;/P&gt;&lt;P class="p1"&gt;After copying the hbase-site.xml file, restart the Hive service so that it picks up the new configuration.&lt;/P&gt;&lt;P class="p1"&gt;&lt;STRONG&gt;4. Check Hive and HBase Connectivity&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;Make sure that the Hive service can properly communicate with HBase by running a simple query that accesses HBase data through Hive.&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Chethan YM&lt;/P&gt;</description>
      <pubDate>Wed, 07 Aug 2024 05:04:29 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/INSERT-OVERWRITE-from-HBase-External-Table-into-Hive-Managed/m-p/391431#M247619</guid>
      <dc:creator>ChethanYM</dc:creator>
      <dc:date>2024-08-07T05:04:29Z</dc:date>
    </item>
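Step 2 of the answer above (verifying hbase.zookeeper.quorum and hbase.zookeeper.property.clientPort in the copied hbase-site.xml) can be scripted. A minimal sketch, assuming the standard Hadoop configuration/property XML layout; the function names are invented and the path in the usage comment follows the answer's copy step:

```python
# Sketch of step 2: parse the hbase-site.xml that Hive will read and flag
# the two ZooKeeper keys the answer says to verify. Assumes the standard
# Hadoop configuration/property file format; helper names are hypothetical.
import xml.etree.ElementTree as ET

REQUIRED_KEYS = (
    "hbase.zookeeper.quorum",
    "hbase.zookeeper.property.clientPort",
)

def read_hbase_props(path):
    """Return a dict of property name -> value from an hbase-site.xml file."""
    props = {}
    for prop in ET.parse(path).getroot().iter("property"):
        name = prop.findtext("name")
        if name:
            props[name] = prop.findtext("value")
    return props

def missing_zk_keys(props):
    """Return the required ZooKeeper keys that are absent or empty."""
    return [key for key in REQUIRED_KEYS if not props.get(key)]

# Usage on a real node, per the answer's copy step:
#     missing_zk_keys(read_hbase_props("/etc/hive/conf/hbase-site.xml"))
```

An empty list means both keys are present; any returned key names point at the configuration Hive is missing.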
    <item>
      <title>Re: INSERT OVERWRITE from HBase External Table into Hive Managed Table.</title>
      <link>https://community.cloudera.com/t5/Support-Questions/INSERT-OVERWRITE-from-HBase-External-Table-into-Hive-Managed/m-p/391476#M247644</link>
      <description>&lt;P&gt;Thanks&amp;nbsp;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/75213"&gt;@ChethanYM&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have completed the 4 steps you suggested and it is still failing.&lt;/P&gt;&lt;P&gt;I also executed:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;with tmp_table as (select * from external_habse_table) select * from tmp_table;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;tmp_table shows the results, so reading works; the failure only occurs when persisting the information into a managed table.&lt;/P&gt;&lt;P&gt;Any other ideas?&lt;/P&gt;</description>
      <pubDate>Wed, 07 Aug 2024 20:30:50 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/INSERT-OVERWRITE-from-HBase-External-Table-into-Hive-Managed/m-p/391476#M247644</guid>
      <dc:creator>Marks_08</dc:creator>
      <dc:date>2024-08-07T20:30:50Z</dc:date>
    </item>
    <item>
      <title>Re: INSERT OVERWRITE from HBase External Table into Hive Managed Table.</title>
      <link>https://community.cloudera.com/t5/Support-Questions/INSERT-OVERWRITE-from-HBase-External-Table-into-Hive-Managed/m-p/391545#M247668</link>
      <description>&lt;UL&gt;&lt;LI&gt;Did the query fail in the compilation stage or the execution stage?&lt;/LI&gt;&lt;LI&gt;Could you please share the complete stack-trace of the query failure?&lt;/LI&gt;&lt;/UL&gt;</description>
      <pubDate>Thu, 08 Aug 2024 14:54:53 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/INSERT-OVERWRITE-from-HBase-External-Table-into-Hive-Managed/m-p/391545#M247668</guid>
      <dc:creator>ggangadharan</dc:creator>
      <dc:date>2024-08-08T14:54:53Z</dc:date>
    </item>
    <item>
      <title>Re: INSERT OVERWRITE from HBase External Table into Hive Managed Table.</title>
      <link>https://community.cloudera.com/t5/Support-Questions/INSERT-OVERWRITE-from-HBase-External-Table-into-Hive-Managed/m-p/391667#M247717</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/92016"&gt;@ggangadharan&lt;/a&gt;&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;Query is failing in execution stage.&lt;BR /&gt;Here is the complete stack-trace:&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Container: container_e56_1723487266861_0004_01_000001 on host11.com:8041&lt;BR /&gt;LogAggregationType: LOCAL&lt;BR /&gt;=============================================================================================&lt;BR /&gt;LogType:syslog_dag_1723487266861_0004_2&lt;BR /&gt;LogLastModifiedTime:Mon Aug 12 13:24:12 -0700 2024&lt;BR /&gt;LogLength:98347&lt;BR /&gt;LogContents:&lt;BR /&gt;2024-08-12 13:23:23,676 [INFO] [IPC Server handler 1 on 46167] |app.DAGAppMaster|: Running DAG: insert into table managed_id_name (sele...a) (Stage-1), callerContext={ context=HIVE, callerType=HIVE_QUERY_ID, callerId=hive_20240812132322_49cb35cb-859c-43a3-9bdc-8d069716bf38 }&lt;BR /&gt;2024-08-12 13:23:23,698 [INFO] [IPC Server handler 1 on 46167] |HistoryEventHandler.criticalEvents|: [HISTORY][DAG:dag_1723487266861_0004_2][Event:DAG_SUBMITTED]: dagID=dag_1723487266861_0004_2, submitTime=1723494203674, queueName=default&lt;BR /&gt;2024-08-12 13:23:23,705 [INFO] [IPC Server handler 1 on 46167] |impl.VertexImpl|: Default container context for vertex_1723487266861_0004_2_00 [Map 1]=LocalResources: [[ name=hbase-shaded-mapreduce-2.2.3.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hbase-shaded-mapreduce-2.2.3.7.1.7.0-551.jar" } size: 43638473 timestamp: 1723487505772 type: FILE visibility: PRIVATE],[ name=hive-exec-3.1.3000.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hive-exec-3.1.3000.7.1.7.0-551.jar" } size: 45604768 timestamp: 1723487315123 type: 
FILE visibility: PRIVATE],[ name=hive-exec.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hive-exec.jar" } size: 45604768 timestamp: 1723487315601 type: FILE visibility: PRIVATE],[ name=tez-conf.pb, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9/.tez/application_1723487266861_0004/tez-conf.pb" } size: 137728 timestamp: 1723494184859 type: FILE visibility: APPLICATION],[ name=hadoop-mapreduce-client-core-3.1.1.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hadoop-mapreduce-client-core-3.1.1.7.1.7.0-551.jar" } size: 1715959 timestamp: 1723487505901 type: FILE visibility: PRIVATE],[ name=tezlib, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/user/tez/0.9.1.7.1.7.0-551/tez.tar.gz" } size: 293092553 timestamp: 1691791773812 type: ARCHIVE visibility: PUBLIC],[ name=hive-exec-3.1.3000.7.1.7.0-551-ff4237ff03c5fe86dc9ea30fadc41b3b3d73040c9df75cd87a13fa2b20642f4f.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/user/hive/.hiveJars/hive-exec-3.1.3000.7.1.7.0-551-ff4237ff03c5fe86dc9ea30fadc41b3b3d73040c9df75cd87a13fa2b20642f4f.jar" } size: 45604768 timestamp: 1691791898900 type: FILE visibility: PRIVATE],[ name=htrace-core4-4.2.0-incubating.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/htrace-core4-4.2.0-incubating.jar" } size: 1506370 timestamp: 1723487505807 type: FILE visibility: PRIVATE],[ name=hive-hbase-handler-3.1.3000.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: 
"/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hive-hbase-handler-3.1.3000.7.1.7.0-551.jar" } size: 118906 timestamp: 1723487505370 type: FILE visibility: PRIVATE],[ name=hadoop-common-3.1.1.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hadoop-common-3.1.1.7.1.7.0-551.jar" } size: 4241824 timestamp: 1723487505866 type: FILE visibility: PRIVATE],], environment: [[ SHELL=/bin/bash ],[ LD_LIBRARY_PATH=$PWD:$HADOOP_COMMON_HOME/lib/native:$JAVA_LIBRARY_PATH:$PWD:${PARCELS_ROOT}/CDH/lib/hadoop/lib/native:$PWD:$HADOOP_COMMON_HOME/lib/native:$JAVA_LIBRARY_PATH:$PWD:${PARCELS_ROOT}/CDH/lib/hadoop/lib/native:$HADOOP_COMMON_HOME/lib/native/ ],[ HADOOP_ROOT_LOGGER=INFO,CLA ],[ CLASSPATH=$PWD:$PWD/*:$PWD/tezlib/*:$PWD/tezlib/lib/*: ],], credentials(token kinds): [HBASE_AUTH_TOKEN,HDFS_DELEGATION_TOKEN,tez.job,], javaOpts: -server -Djava.net.preferIPv4Stack=true -XX:+PrintGCDetails -verbose:gc -XX:+UseNUMA -XX:+UseG1GC -XX:+ResizeTLAB -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp -server -Djava.net.preferIPv4Stack=true -XX:NewRatio=8 -XX:+UseNUMA -XX:+UseG1GC -XX:+ResizeTLAB -XX:+PrintGCDetails -verbose:gc -Dlog4j.configuratorClass=org.apache.tez.common.TezLog4jConfigurator -Dlog4j.configuration=tez-container-log4j.properties -Dyarn.app.container.log.dir=&amp;lt;LOG_DIR&amp;gt; -Dtez.root.logger=INFO,CLA , vertex: vertex_1723487266861_0004_2_00 [Map 1], Default Resources=&amp;lt;memory:4096, vCores:1&amp;gt;&lt;BR /&gt;2024-08-12 13:23:23,706 [INFO] [IPC Server handler 1 on 46167] |impl.VertexImpl|: Setting 1 additional inputs for vertexvertex_1723487266861_0004_2_00 [Map 1]&lt;BR /&gt;2024-08-12 13:23:23,706 [INFO] [IPC Server handler 1 on 46167] |impl.VertexImpl|: Using ExecutionContext from Vertex for Vertex Map 1&lt;BR /&gt;2024-08-12 13:23:23,706 [INFO] [IPC Server handler 1 on 46167] |impl.VertexImpl|: Running 
vertex: vertex_1723487266861_0004_2_00 [Map 1] : TaskScheduler=0:TezYarn, ContainerLauncher=0:TezYarn, TaskCommunicator=0:TezYarn&lt;BR /&gt;2024-08-12 13:23:23,708 [INFO] [IPC Server handler 1 on 46167] |impl.VertexImpl|: Default container context for vertex_1723487266861_0004_2_01 [Reducer 2]=LocalResources: [[ name=hbase-shaded-mapreduce-2.2.3.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hbase-shaded-mapreduce-2.2.3.7.1.7.0-551.jar" } size: 43638473 timestamp: 1723487505772 type: FILE visibility: PRIVATE],[ name=hive-exec-3.1.3000.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hive-exec-3.1.3000.7.1.7.0-551.jar" } size: 45604768 timestamp: 1723487315123 type: FILE visibility: PRIVATE],[ name=hive-exec.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hive-exec.jar" } size: 45604768 timestamp: 1723487315601 type: FILE visibility: PRIVATE],[ name=tez-conf.pb, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9/.tez/application_1723487266861_0004/tez-conf.pb" } size: 137728 timestamp: 1723494184859 type: FILE visibility: APPLICATION],[ name=hadoop-mapreduce-client-core-3.1.1.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hadoop-mapreduce-client-core-3.1.1.7.1.7.0-551.jar" } size: 1715959 timestamp: 1723487505901 type: FILE visibility: PRIVATE],[ name=tezlib, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/user/tez/0.9.1.7.1.7.0-551/tez.tar.gz" } size: 293092553 timestamp: 1691791773812 type: ARCHIVE 
visibility: PUBLIC],[ name=hive-exec-3.1.3000.7.1.7.0-551-ff4237ff03c5fe86dc9ea30fadc41b3b3d73040c9df75cd87a13fa2b20642f4f.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/user/hive/.hiveJars/hive-exec-3.1.3000.7.1.7.0-551-ff4237ff03c5fe86dc9ea30fadc41b3b3d73040c9df75cd87a13fa2b20642f4f.jar" } size: 45604768 timestamp: 1691791898900 type: FILE visibility: PRIVATE],[ name=htrace-core4-4.2.0-incubating.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/htrace-core4-4.2.0-incubating.jar" } size: 1506370 timestamp: 1723487505807 type: FILE visibility: PRIVATE],[ name=hive-hbase-handler-3.1.3000.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hive-hbase-handler-3.1.3000.7.1.7.0-551.jar" } size: 118906 timestamp: 1723487505370 type: FILE visibility: PRIVATE],[ name=hadoop-common-3.1.1.7.1.7.0-551.jar, value=resource { scheme: "hdfs" host: "host14.com" port: 8020 file: "/tmp/hive/hive/_tez_session_dir/3830aac0-c85a-4934-ae3c-242a1ea079e9-resources/hadoop-common-3.1.1.7.1.7.0-551.jar" } size: 4241824 timestamp: 1723487505866 type: FILE visibility: PRIVATE],], environment: [[ SHELL=/bin/bash ],[ LD_LIBRARY_PATH=$PWD:$HADOOP_COMMON_HOME/lib/native:$JAVA_LIBRARY_PATH:$PWD:${PARCELS_ROOT}/CDH/lib/hadoop/lib/native:$PWD:$HADOOP_COMMON_HOME/lib/native:$JAVA_LIBRARY_PATH:$PWD:${PARCELS_ROOT}/CDH/lib/hadoop/lib/native:$HADOOP_COMMON_HOME/lib/native/ ],[ HADOOP_ROOT_LOGGER=INFO,CLA ],[ CLASSPATH=$PWD:$PWD/*:$PWD/tezlib/*:$PWD/tezlib/lib/*: ],], credentials(token kinds): [HBASE_AUTH_TOKEN,HDFS_DELEGATION_TOKEN,tez.job,], javaOpts: -server -Djava.net.preferIPv4Stack=true -XX:+PrintGCDetails -verbose:gc -XX:+UseNUMA -XX:+UseG1GC -XX:+ResizeTLAB -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp -server -Djava.net.preferIPv4Stack=true 
-XX:NewRatio=8 -XX:+UseNUMA -XX:+UseG1GC -XX:+ResizeTLAB -XX:+PrintGCDetails -verbose:gc -Dlog4j.configuratorClass=org.apache.tez.common.TezLog4jConfigurator -Dlog4j.configuration=tez-container-log4j.properties -Dyarn.app.container.log.dir=&amp;lt;LOG_DIR&amp;gt; -Dtez.root.logger=INFO,CLA , vertex: vertex_1723487266861_0004_2_01 [Reducer 2], Default Resources=&amp;lt;memory:4096, vCores:1&amp;gt;&lt;BR /&gt;2024-08-12 13:23:23,708 [INFO] [IPC Server handler 1 on 46167] |impl.VertexImpl|: Setting 1 additional outputs for vertex vertex_1723487266861_0004_2_01 [Reducer 2]&lt;BR /&gt;2024-08-12 13:23:23,708 [INFO] [IPC Server handler 1 on 46167] |impl.VertexImpl|: Using ExecutionContext from Vertex for Vertex Reducer 2&lt;BR /&gt;2024-08-12 13:23:23,709 [INFO] [IPC Server handler 1 on 46167] |impl.VertexImpl|: Running vertex: vertex_1723487266861_0004_2_01 [Reducer 2] : TaskScheduler=0:TezYarn, ContainerLauncher=0:TezYarn, TaskCommunicator=0:TezYarn&lt;BR /&gt;2024-08-12 13:23:23,713 [INFO] [IPC Server handler 1 on 46167] |impl.DAGImpl|: Using DAG Scheduler: org.apache.tez.dag.app.dag.impl.DAGSchedulerNaturalOrder&lt;BR /&gt;2024-08-12 13:23:23,713 [INFO] [IPC Server handler 1 on 46167] |tez.Utils|: Generating DAG graphviz file, dagId=dag_1723487266861_0004_2, filePath=/yarn/container-logs/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/dag_1723487266861_0004_2_priority.dot&lt;BR /&gt;2024-08-12 13:23:23,714 [INFO] [IPC Server handler 1 on 46167] |HistoryEventHandler.criticalEvents|: [HISTORY][DAG:dag_1723487266861_0004_2][Event:DAG_INITIALIZED]: dagID=dag_1723487266861_0004_2, initTime=1723494203714&lt;BR /&gt;2024-08-12 13:23:23,714 [INFO] [IPC Server handler 1 on 46167] |impl.DAGImpl|: dag_1723487266861_0004_2 transitioned from NEW to INITED due to event DAG_INIT&lt;BR /&gt;2024-08-12 13:23:23,714 [INFO] [Dispatcher thread {Central}] |impl.DAGImpl|: Added additional resources : [[]] to classpath&lt;BR /&gt;2024-08-12 13:23:23,715 [INFO] 
[Dispatcher thread {Central}] |HistoryEventHandler.criticalEvents|: [HISTORY][DAG:dag_1723487266861_0004_2][Event:DAG_STARTED]: dagID=dag_1723487266861_0004_2, startTime=1723494203714&lt;BR /&gt;2024-08-12 13:23:23,715 [INFO] [Dispatcher thread {Central}] |impl.DAGImpl|: dag_1723487266861_0004_2 transitioned from INITED to RUNNING due to event DAG_START&lt;BR /&gt;2024-08-12 13:23:23,715 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Root Inputs exist for Vertex: Map 1 : {a={InputName=a}, {Descriptor=ClassName=org.apache.tez.mapreduce.input.MRInputLegacy, hasPayload=true}, {ControllerDescriptor=ClassName=org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator, hasPayload=false}}&lt;BR /&gt;2024-08-12 13:23:23,715 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Starting root input initializer for input: a, with class: [org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator]&lt;BR /&gt;2024-08-12 13:23:23,715 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Setting vertexManager to RootInputVertexManager for vertex_1723487266861_0004_2_00 [Map 1]&lt;BR /&gt;2024-08-12 13:23:23,729 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Num tasks is -1. Expecting VertexManager/InputInitializers/1-1 split to set #tasks for the vertex vertex_1723487266861_0004_2_00 [Map 1]&lt;BR /&gt;2024-08-12 13:23:23,729 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Vertex will initialize from input initializer. 
vertex_1723487266861_0004_2_00 [Map 1]&lt;BR /&gt;2024-08-12 13:23:23,729 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Starting 1 inputInitializers for vertex vertex_1723487266861_0004_2_00 [Map 1]&lt;BR /&gt;2024-08-12 13:23:23,741 [INFO] [App Shared Pool - #1] |exec.Utilities|: PLAN PATH = hdfs://host14.com:8020/tmp/hive/hive/8a99b093-53f4-455b-8224-5f6355e960ed/hive_2024-08-12_13-23-22_924_2556685397081991603-9/hive/_tez_scratch_dir/2ea04305-4390-4a81-a6c2-8376f90dda7f/map.xml&lt;BR /&gt;2024-08-12 13:23:23,742 [INFO] [App Shared Pool - #1] |exec.SerializationUtilities|: Deserializing MapWork using kryo&lt;BR /&gt;2024-08-12 13:23:23,765 [INFO] [App Shared Pool - #1] |exec.Utilities|: Deserialized plan (via RPC) - name: Map 1 size: 8.28KB&lt;BR /&gt;2024-08-12 13:23:23,765 [INFO] [App Shared Pool - #1] |tez.HiveSplitGenerator|: SplitGenerator using llap affinitized locations: false&lt;BR /&gt;2024-08-12 13:23:23,766 [INFO] [App Shared Pool - #1] |tez.HiveSplitGenerator|: SplitLocationProvider: org.apache.hadoop.hive.ql.exec.tez.Utils$1@7aebc17b&lt;BR /&gt;2024-08-12 13:23:23,767 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: vertex_1723487266861_0004_2_00 [Map 1] transitioned from NEW to INITIALIZING due to event V_INIT&lt;BR /&gt;2024-08-12 13:23:23,767 [INFO] [App Shared Pool - #0] |dag.RootInputInitializerManager|: Starting InputInitializer for Input: a on vertex vertex_1723487266861_0004_2_00 [Map 1]&lt;BR /&gt;2024-08-12 13:23:23,768 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Setting vertexManager to ShuffleVertexManager for vertex_1723487266861_0004_2_01 [Reducer 2]&lt;BR /&gt;2024-08-12 13:23:23,768 [INFO] [App Shared Pool - #0] |tez.HiveSplitGenerator|: GenerateConsistentSplitsInHive=true&lt;BR /&gt;2024-08-12 13:23:23,769 [INFO] [App Shared Pool - #0] |tez.HiveSplitGenerator|: The preferred split size is 16777216&lt;BR /&gt;2024-08-12 13:23:23,770 [INFO] [App Shared Pool - #0] |exec.Utilities|: PLAN PATH = 
hdfs://host14.com:8020/tmp/hive/hive/8a99b093-53f4-455b-8224-5f6355e960ed/hive_2024-08-12_13-23-22_924_2556685397081991603-9/hive/_tez_scratch_dir/2ea04305-4390-4a81-a6c2-8376f90dda7f/map.xml&lt;BR /&gt;2024-08-12 13:23:23,770 [INFO] [App Shared Pool - #0] |exec.Utilities|: Processing alias a&lt;BR /&gt;2024-08-12 13:23:23,770 [INFO] [App Shared Pool - #0] |exec.Utilities|: Adding 1 inputs; the first input is hdfs://host14.com:8020/warehouse/tablespace/external/hive/c_0EXTERNAL_ML_SERDE&lt;BR /&gt;2024-08-12 13:23:23,771 [INFO] [App Shared Pool - #0] |io.HiveInputFormat|: hive.io.file.readcolumn.ids = 0,1,2&lt;BR /&gt;2024-08-12 13:23:23,772 [INFO] [App Shared Pool - #0] |io.HiveInputFormat|: hive.io.file.readcolumn.names = key,cf1_id,cf1_name&lt;BR /&gt;2024-08-12 13:23:23,772 [INFO] [App Shared Pool - #0] |io.HiveInputFormat|: Generating splits for dirs: hdfs://host14.com:8020/warehouse/tablespace/external/hive/c_0EXTERNAL_ML_SERDE&lt;BR /&gt;2024-08-12 13:23:23,782 [INFO] [Dispatcher thread {Central}] |vertexmanager.ShuffleVertexManagerBase|: Settings minFrac: 0.2 maxFrac: 0.4 auto: false desiredTaskIput: 104857600&lt;BR /&gt;2024-08-12 13:23:23,782 [INFO] [Dispatcher thread {Central}] |vertexmanager.ShuffleVertexManager|: minTaskParallelism 1&lt;BR /&gt;2024-08-12 13:23:23,782 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Creating 1 tasks for vertex: vertex_1723487266861_0004_2_01 [Reducer 2]&lt;BR /&gt;2024-08-12 13:23:23,782 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Directly initializing vertex: vertex_1723487266861_0004_2_01 [Reducer 2]&lt;BR /&gt;2024-08-12 13:23:23,782 [INFO] [Dispatcher thread {Central}] |HistoryEventHandler.criticalEvents|: [HISTORY][DAG:dag_1723487266861_0004_2][Event:VERTEX_CONFIGURE_DONE]: vertexId=vertex_1723487266861_0004_2_01, reconfigureDoneTime=1723494203782, numTasks=1, vertexLocationHint=null, edgeManagersCount=1, rootInputSpecUpdateCount=0, setParallelismCalledFlag=false&lt;BR /&gt;2024-08-12 
13:23:23,782 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Setting up committers for vertex vertex_1723487266861_0004_2_01 [Reducer 2], numAdditionalOutputs=1&lt;BR /&gt;2024-08-12 13:23:23,783 [INFO] [Dispatcher thread {Central}] |HistoryEventHandler.criticalEvents|: [HISTORY][DAG:dag_1723487266861_0004_2][Event:VERTEX_INITIALIZED]: vertexName=Reducer 2, vertexId=vertex_1723487266861_0004_2_01, initRequestedTime=1723494203767, initedTime=1723494203782, numTasks=1, processorName=org.apache.hadoop.hive.ql.exec.tez.ReduceTezProcessor, additionalInputsCount=0, initGeneratedEventsCount=0, servicePluginInfo=ServicePluginInfo {containerLauncherName=TezYarn, taskSchedulerName=TezYarn, taskCommunicatorName=TezYarn, containerLauncherClassName=org.apache.tez.dag.app.launcher.TezContainerLauncherImpl, taskSchedulerClassName=org.apache.tez.dag.app.rm.YarnTaskSchedulerService, taskCommunicatorClassName=org.apache.tez.dag.app.TezTaskCommunicatorImpl }&lt;BR /&gt;2024-08-12 13:23:23,783 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: vertex_1723487266861_0004_2_01 [Reducer 2] transitioned from NEW to INITED due to event V_INIT&lt;BR /&gt;2024-08-12 13:23:23,891 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:zookeeper.version=3.5.5-551-0bb5994437a9a16c597ace352f57516f2582d0f9, built on 08/03/2021 19:51 GMT&lt;BR /&gt;2024-08-12 13:23:23,891 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:host.name=host11.com&lt;BR /&gt;2024-08-12 13:23:23,891 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:java.version=1.8.0_231&lt;BR /&gt;2024-08-12 13:23:23,891 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:java.vendor=client Corporation&lt;BR /&gt;2024-08-12 
13:23:23,891 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:java.home=/usr/java/jdk1.8.0_231/jre&lt;BR /&gt;2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:java.class.path=/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/hadoop-common-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/hive-exec.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/hadoop-mapreduce-client-core-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/htrace-core4-4.2.0-incubating.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/hive-exec-3.1.3000.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/hive-exec-3.1.3000.7.1.7.0-551-ff4237ff03c5fe86dc9ea30fadc41b3b3d73040c9df75cd87a13fa2b20642f4f.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/hbase-shaded-mapreduce-2.2.3.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/hive-hbase-handler-3.1.3000.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/hadoop-shim-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-api-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcach
e/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-common-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-runtime-internals-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-runtime-library-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-mapreduce-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-examples-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-dag-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-tests-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-ext-service-tests-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-protobuf-history-plugin-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-yarn-timeline-history-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-yarn-timeline-history-with-acls-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-yarn-timeline-cache-plugin-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-yarn-timeline-history-with-fs-0.9.1.7.1.
7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-history-parser-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-job-analyzer-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/tez-javadoc-tools-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/hadoop-shim-2.8-0.9.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/slf4j-api-1.7.30.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-yarn-api-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-lang-2.6.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/guava-28.1-jre.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/failureaccess-1.0.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jsr305-3.0.0.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/checker-qual-2.8.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/j2objc-annotations-1.3.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_
0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/animal-sniffer-annotations-1.18.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-logging-1.1.3.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jaxb-api-2.2.11.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/protobuf-java-2.5.0.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jackson-annotations-2.10.5.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-common-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-annotations-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-cli-1.2.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-math3-3.1.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/httpclient-4.5.13.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/httpcore-4.4.13.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-codec-1.4.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-io-2.4.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-net-3.6.jar:/yarn/nm/usercache/hive/appcache/application_1
723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-collections-3.2.2.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/javax.servlet-api-3.1.0.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/javax.activation-api-1.2.0.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jetty-server-9.4.39.v20210325.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jetty-http-9.4.39.v20210325.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jetty-util-9.4.39.v20210325.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jetty-io-9.4.39.v20210325.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jetty-servlet-9.4.39.v20210325.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jetty-security-9.4.39.v20210325.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jetty-util-ajax-9.4.39.v20210325.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jetty-webapp-9.4.39.v20210325.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jetty-xml-9.4.39.v20210325.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jersey-json-1.19.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_0
00001/tezlib/lib/jettison-1.3.4.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jaxb-impl-2.2.3-1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jackson-core-asl-1.9.2.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jackson-jaxrs-1.9.2.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jackson-xc-1.9.2.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jackson-mapper-asl-1.9.13-cloudera.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/log4j-1.2.17-cloudera1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-beanutils-1.9.4.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-configuration2-2.1.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-lang3-3.8.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/slf4j-log4j12-1.7.30.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/avro-1.8.2.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/paranamer-2.8.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/snappy-java-1.1.7.7.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_17234872668
61_0004_01_000001/tezlib/lib/commons-compress-1.19.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/xz-1.8.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/re2j-1.2.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/gson-2.2.4.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/zookeeper-3.5.5.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/zookeeper-jute-3.5.5.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/audience-annotations-0.5.0.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/netty-all-4.1.60.Final.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/kerb-core-1.1.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/kerby-pkix-1.1.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/kerby-asn1-1.1.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/kerby-util-1.1.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jackson-databind-2.10.5.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jackson-core-2.10.5.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/t
ezlib/lib/stax2-api-3.1.4.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/woodstox-core-5.0.3.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-cloud-storage-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-aws-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/aws-java-sdk-bundle-1.11.901.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/wildfly-openssl-1.0.7.Final.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-azure-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/azure-storage-7.0.0.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/azure-keyvault-core-1.0.0.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-azure-datalake-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/azure-data-lake-store-sdk-2.3.6.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/gcs-connector-2.1.2.7.1.7.0-551-shaded.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/google-extensions-0.5.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/flogger-0.5.jar:/yarn/nm/usercach
e/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/checker-compat-qual-2.5.3.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/flogger-system-backend-0.5.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/flogger-slf4j-backend-0.5.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/gateway-cloud-bindings-1.3.0.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/gateway-shell-1.3.0.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/gateway-util-common-1.3.0.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/gateway-i18n-1.3.0.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/javax.activation-1.2.0.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/groovy-3.0.7.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/groovy-groovysh-3.0.7.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/groovy-templates-3.0.7.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/groovy-xml-3.0.7.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/groovy-console-3.0.7.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861
_0004_01_000001/tezlib/lib/groovy-swing-3.0.7.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/groovy-json-3.0.7.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/json-smart-2.3.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/accessors-smart-1.2.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/asm-5.0.4.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-configuration-1.10.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/forbiddenapis-2.7.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/ranger-raz-hook-abfs-2.1.0.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/ranger-raz-intg-2.1.0.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-ozone-filesystem-hadoop3-1.1.0.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/curator-client-4.3.0.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/curator-framework-4.3.0.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/curator-recipes-4.3.0.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/error_prone_annotations-2.3.2.jar:
/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/stax-api-1.0.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-auth-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/nimbus-jose-jwt-7.9.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jcip-annotations-1.0-1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/kerb-simplekdc-1.1.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/kerb-client-1.1.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/kerby-config-1.1.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/kerb-common-1.1.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/kerb-crypto-1.1.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/kerb-util-1.1.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/token-provider-1.1.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/kerb-admin-1.1.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/kerb-server-1.1.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/kerb-identity-1.1.1.jar:/yarn/nm/usercache
/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/kerby-xdr-1.1.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-yarn-common-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-hdfs-client-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/okhttp-2.7.5.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/okio-1.6.0.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jersey-core-1.19.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jsr311-api-1.1.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jersey-client-1.19.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/guice-servlet-4.0.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/guice-4.0.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/javax.inject-1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/aopalliance-1.0.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jackson-module-jaxb-annotations-2.10.5.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jakarta.xml.bind-api-2.3.2.jar:/yarn/nm/usercache
/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jakarta.activation-api-1.2.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jackson-jaxrs-json-provider-2.10.5.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jackson-jaxrs-base-2.10.5.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-yarn-client-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-yarn-registry-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-daemon-1.0.13.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/dnsjava-2.1.7.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/commons-collections4-4.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/leveldbjni-all-1.8.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/RoaringBitmap-0.4.9.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/async-http-client-2.12.3.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/async-http-client-netty-utils-2.12.3.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/netty-buffer-4.1.60.Final.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_
1723487266861_0004_01_000001/tezlib/lib/netty-common-4.1.60.Final.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/jakarta.activation-1.2.2.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/netty-codec-http-4.1.60.Final.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/netty-transport-4.1.60.Final.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/netty-resolver-4.1.60.Final.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/netty-codec-4.1.60.Final.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/netty-handler-4.1.60.Final.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/netty-codec-socks-4.1.60.Final.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/netty-handler-proxy-4.1.60.Final.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/netty-transport-native-epoll-4.1.60.Final-linux-x86_64.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/netty-transport-native-unix-common-4.1.60.Final.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/netty-transport-native-kqueue-4.1.60.Final-osx-x86_64.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/reactive-streams-1.0.3.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861
_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/netty-reactive-streams-2.0.4.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-mapreduce-client-core-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-mapreduce-client-common-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-yarn-server-web-proxy-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-yarn-server-common-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/ehcache-3.3.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/HikariCP-java7-2.4.12.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/mssql-jdbc-6.2.1.jre7.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/bcpkix-jdk15on-1.60.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/metrics-core-3.1.0.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-yarn-server-applicationhistoryservice-3.1.1.7.1.7.0-551.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/objenesis-1.0.jar:/yarn/nm/usercache/hive/appcache/application_
1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/fst-2.50.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/java-util-1.9.0.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/json-io-2.5.1.jar:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tezlib/lib/hadoop-yarn-server-timeline-pluginstorage-3.1.1.7.1.7.0-551.jar:&lt;BR /&gt;2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:java.library.path=/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001:/CDH/lib/hadoop/lib/native:/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001:/CDH/lib/hadoop/lib/native:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/lib/native/:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib&lt;BR /&gt;2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:java.io.tmpdir=/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001/tmp&lt;BR /&gt;2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:java.compiler=&amp;lt;NA&amp;gt;&lt;BR /&gt;2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:os.name=Linux&lt;BR /&gt;2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:os.arch=amd64&lt;BR /&gt;2024-08-12 13:23:23,892 
[INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:os.version=5.4.17-2136.321.4.1.el8uek.x86_64&lt;BR /&gt;2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:user.name=hive&lt;BR /&gt;2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:user.home=/var/lib/hive&lt;BR /&gt;2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:user.dir=/yarn/nm/usercache/hive/appcache/application_1723487266861_0004/container_e56_1723487266861_0004_01_000001&lt;BR /&gt;2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:os.memory.free=1417MB&lt;BR /&gt;2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:os.memory.max=1638MB&lt;BR /&gt;2024-08-12 13:23:23,892 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Client environment:os.memory.total=1638MB&lt;BR /&gt;2024-08-12 13:23:23,895 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ZooKeeper|: Initiating client connection, connectString=host14.com:2181,host11.com:2181,host12.com:2181 sessionTimeout=90000 watcher=org.apache.hadoop.hbase.zookeeper.ReadOnlyZKClient$$Lambda$95/1871934544@473d881c&lt;BR /&gt;2024-08-12 13:23:23,898 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |common.X509Util|: Setting -D jdk.tls.rejectClientInitiatedRenegotiation=true to disable client-initiated TLS renegotiation&lt;BR /&gt;2024-08-12 13:23:23,901 [INFO] 
[ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ClientCnxnSocket|: jute.maxbuffer value is 4194304 Bytes&lt;BR /&gt;2024-08-12 13:23:23,907 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c] |zookeeper.ClientCnxn|: zookeeper.request.timeout value is 0. feature enabled=&lt;BR /&gt;2024-08-12 13:23:23,915 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c-SendThread(host14.com:2181)] |zookeeper.ClientCnxn|: Opening socket connection to server host14.com/10.243.11.186:2181. Will not attempt to authenticate using SASL (unknown error)&lt;BR /&gt;2024-08-12 13:23:23,915 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c-SendThread(host14.com:2181)] |zookeeper.ClientCnxn|: Socket connection established, initiating session, client: /10.243.11.183:52366, server: host14.com/10.243.11.186:2181&lt;BR /&gt;2024-08-12 13:23:23,920 [INFO] [ReadOnlyZKClient-host14.com:2181,host11.com:2181,host12.com:2181@0x5d00b21c-SendThread(host14.com:2181)] |zookeeper.ClientCnxn|: Session establishment complete on server host14.com/10.243.11.186:2181, sessionid = 0x10762717081009c, negotiated timeout = 60000&lt;BR /&gt;2024-08-12 13:23:24,102 [INFO] [App Shared Pool - #0] |mapreduce.RegionSizeCalculator|: Calculating region sizes for table "external_ml_serde".&lt;BR /&gt;2024-08-12 13:23:28,718 [INFO] [hconnection-0x7093268-shared-pool3-t1] |client.RpcRetryingCallerImpl|: Call exception, tries=6, retries=16, started=4467 ms ago, cancelled=false, msg=Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed, details=row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1, see &lt;A href="https://s.apache.org/timeout" target="_blank"&gt;https://s.apache.org/timeout&lt;/A&gt;&lt;BR 
/&gt;2024-08-12 13:23:32,738 [INFO] [hconnection-0x7093268-shared-pool3-t1] |client.RpcRetryingCallerImpl|: Call exception, tries=7, retries=16, started=8487 ms ago, cancelled=false, msg=Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed, details=row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1, see &lt;A href="https://s.apache.org/timeout" target="_blank"&gt;https://s.apache.org/timeout&lt;/A&gt;&lt;BR /&gt;2024-08-12 13:23:33,545 [INFO] [DelayedContainerManager] |rm.YarnTaskSchedulerService|: No taskRequests. Container's idle timeout delay expired or is new. Releasing container, containerId=container_e56_1723487266861_0004_01_000002, containerExpiryTime=1723494213374, idleTimeout=10000, taskRequestsCount=0, heldContainers=1, delayedContainers=0, isNew=false&lt;BR /&gt;2024-08-12 13:23:33,548 [INFO] [Dispatcher thread {Central}] |HistoryEventHandler.criticalEvents|: [HISTORY][DAG:dag_1723487266861_0004_2][Event:CONTAINER_STOPPED]: containerId=container_e56_1723487266861_0004_01_000002, stoppedTime=1723494213548, exitStatus=0&lt;BR /&gt;2024-08-12 13:23:33,549 [INFO] [ContainerLauncher #1] |launcher.TezContainerLauncherImpl|: Stopping container_e56_1723487266861_0004_01_000002&lt;BR /&gt;2024-08-12 13:23:33,792 [INFO] [Dispatcher thread {Central}] |container.AMContainerImpl|: Container container_e56_1723487266861_0004_01_000002 exited with diagnostics set to Container failed, exitCode=-105. [2024-08-12 13:23:33.560]Container killed by the ApplicationMaster.&lt;BR /&gt;[2024-08-12 13:23:33.574]Container killed on request. 
Exit code is 143&lt;BR /&gt;[2024-08-12 13:23:33.574]Container exited with a non-zero exit code 143.&lt;/P&gt;&lt;P&gt;2024-08-12 13:23:34,545 [INFO] [AMRM Callback Handler Thread] |rm.YarnTaskSchedulerService|: Allocated: &amp;lt;memory:0, vCores:0&amp;gt; Free: &amp;lt;memory:664576, vCores:263&amp;gt; pendingRequests: 0 delayedContainers: 0 heartbeats: 101 lastPreemptionHeartbeat: 100&lt;BR /&gt;2024-08-12 13:23:42,795 [INFO] [hconnection-0x7093268-shared-pool3-t1] |client.RpcRetryingCallerImpl|: Call exception, tries=8, retries=16, started=18544 ms ago, cancelled=false, msg=Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed, details=row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1, see &lt;A href="https://s.apache.org/timeout" target="_blank"&gt;https://s.apache.org/timeout&lt;/A&gt;&lt;BR /&gt;2024-08-12 13:23:47,094 [INFO] [AMRM Callback Handler Thread] |rm.YarnTaskSchedulerService|: Allocated: &amp;lt;memory:0, vCores:0&amp;gt; Free: &amp;lt;memory:664576, vCores:263&amp;gt; pendingRequests: 0 delayedContainers: 0 heartbeats: 151 lastPreemptionHeartbeat: 150&lt;BR /&gt;2024-08-12 13:23:52,848 [INFO] [hconnection-0x7093268-shared-pool3-t1] |client.RpcRetryingCallerImpl|: Call exception, tries=9, retries=16, started=28597 ms ago, cancelled=false, msg=Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed, details=row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1, see &lt;A href="https://s.apache.org/timeout" target="_blank"&gt;https://s.apache.org/timeout&lt;/A&gt;&lt;BR /&gt;2024-08-12 13:23:59,644 [INFO] [AMRM Callback Handler Thread] |rm.YarnTaskSchedulerService|: Allocated: &amp;lt;memory:0, 
vCores:0&amp;gt; Free: &amp;lt;memory:664576, vCores:263&amp;gt; pendingRequests: 0 delayedContainers: 0 heartbeats: 201 lastPreemptionHeartbeat: 200&lt;BR /&gt;2024-08-12 13:24:02,884 [INFO] [hconnection-0x7093268-shared-pool3-t1] |client.RpcRetryingCallerImpl|: Call exception, tries=10, retries=16, started=38633 ms ago, cancelled=false, msg=Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed, details=row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1, see &lt;A href="https://s.apache.org/timeout" target="_blank"&gt;https://s.apache.org/timeout&lt;/A&gt;&lt;BR /&gt;2024-08-12 13:24:12,195 [INFO] [AMRM Callback Handler Thread] |rm.YarnTaskSchedulerService|: Allocated: &amp;lt;memory:0, vCores:0&amp;gt; Free: &amp;lt;memory:664576, vCores:263&amp;gt; pendingRequests: 0 delayedContainers: 0 heartbeats: 251 lastPreemptionHeartbeat: 250&lt;BR /&gt;2024-08-12 13:24:12,895 [INFO] [hconnection-0x7093268-shared-pool3-t1] |client.RpcRetryingCallerImpl|: Call exception, tries=11, retries=16, started=48644 ms ago, cancelled=false, msg=Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed, details=row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1, see &lt;A href="https://s.apache.org/timeout" target="_blank"&gt;https://s.apache.org/timeout&lt;/A&gt;&lt;BR /&gt;2024-08-12 13:24:12,940 [ERROR] [App Shared Pool - #0] |io.HiveInputFormat|: Failed; user hive tokens Kind: HBASE_AUTH_TOKEN, Service: 73f1f19c-6191-4a6f-81a0-8cfe6a5876b7, Ident: ((username=hive/host14.com@US.client.COM, keyId=361, issueDate=1723494203628, expirationDate=1724099003628, sequenceNumber=12)), Kind: HDFS_DELEGATION_TOKEN, Service: 10.243.11.186:8020, Ident: 
(token for hive: HDFS_DELEGATION_TOKEN owner=hive/host14.com@US.client.COM, renewer=yarn, realUser=, issueDate=1723494184828, maxDate=1724098984828, sequenceNumber=27767, masterKeyId=357), Kind: tez.job, Service: application_1723487266861_0004, Ident: 1e 61 70 70 6c 69 63 61 74 69 6f 6e 5f 31 37 32 33 34 38 37 32 36 36 38 36 31 5f 30 30 30 34,&lt;BR /&gt;2024-08-12 13:24:12,940 [INFO] [App Shared Pool - #0] |dag.RootInputInitializerManager|: Failed InputInitializer for Input: a on vertex vertex_1723487266861_0004_2_00 [Map 1]&lt;BR /&gt;2024-08-12 13:24:12,942 [ERROR] [Dispatcher thread {Central}] |impl.VertexImpl|: Vertex Input: a initializer failed, vertex=vertex_1723487266861_0004_2_00 [Map 1]&lt;BR /&gt;org.apache.tez.dag.app.dag.impl.AMUserCodeException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=16, exceptions:&lt;BR /&gt;2024-08-12T20:24:12.897Z, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68694: Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1&lt;/P&gt;&lt;P&gt;at org.apache.tez.dag.app.dag.RootInputInitializerManager.runInitializerAndProcessResult(RootInputInitializerManager.java:188)&lt;BR /&gt;at org.apache.tez.dag.app.dag.RootInputInitializerManager.lambda$createAndStartInitializing$2(RootInputInitializerManager.java:171)&lt;BR /&gt;at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)&lt;BR /&gt;at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)&lt;BR /&gt;at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:69)&lt;BR /&gt;at 
com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)&lt;BR /&gt;at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)&lt;BR /&gt;at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)&lt;BR /&gt;at java.lang.Thread.run(Thread.java:748)&lt;BR /&gt;Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=16, exceptions:&lt;BR /&gt;2024-08-12T20:24:12.897Z, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68694: Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1&lt;/P&gt;&lt;P&gt;at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:299)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:251)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:58)&lt;BR /&gt;at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithoutRetries(RpcRetryingCallerImpl.java:192)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:267)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ClientScanner.loadCache(ClientScanner.java:435)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ClientScanner.nextWithSyncCache(ClientScanner.java:310)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ClientScanner.next(ClientScanner.java:595)&lt;BR /&gt;at org.apache.hadoop.hbase.MetaTableAccessor.scanMeta(MetaTableAccessor.java:800)&lt;BR /&gt;at org.apache.hadoop.hbase.MetaTableAccessor.scanMeta(MetaTableAccessor.java:768)&lt;BR /&gt;at 
org.apache.hadoop.hbase.MetaTableAccessor.scanMeta(MetaTableAccessor.java:721)&lt;BR /&gt;at org.apache.hadoop.hbase.MetaTableAccessor.scanMetaForTableRegions(MetaTableAccessor.java:716)&lt;BR /&gt;at org.apache.hadoop.hbase.client.HRegionLocator.listRegionLocations(HRegionLocator.java:114)&lt;BR /&gt;at org.apache.hadoop.hbase.client.HRegionLocator.getAllRegionLocations(HRegionLocator.java:78)&lt;BR /&gt;at org.apache.hadoop.hbase.mapreduce.RegionSizeCalculator.getRegionServersOfTable(RegionSizeCalculator.java:103)&lt;BR /&gt;at org.apache.hadoop.hbase.mapreduce.RegionSizeCalculator.init(RegionSizeCalculator.java:79)&lt;BR /&gt;at org.apache.hadoop.hbase.mapreduce.RegionSizeCalculator.&amp;lt;init&amp;gt;(RegionSizeCalculator.java:61)&lt;BR /&gt;at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.createRegionSizeCalculator(TableInputFormatBase.java:593)&lt;BR /&gt;at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.oneInputSplitPerRegion(TableInputFormatBase.java:294)&lt;BR /&gt;at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.getSplits(TableInputFormatBase.java:257)&lt;BR /&gt;at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplitsInternal(HiveHBaseTableInputFormat.java:349)&lt;BR /&gt;at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.access$200(HiveHBaseTableInputFormat.java:68)&lt;BR /&gt;at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat$2.run(HiveHBaseTableInputFormat.java:271)&lt;BR /&gt;at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat$2.run(HiveHBaseTableInputFormat.java:269)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:422)&lt;BR /&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1898)&lt;BR /&gt;at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplits(HiveHBaseTableInputFormat.java:269)&lt;BR /&gt;at 
org.apache.hadoop.hive.ql.io.HiveInputFormat.addSplitsForGroup(HiveInputFormat.java:542)&lt;BR /&gt;at org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:850)&lt;BR /&gt;at org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.initialize(HiveSplitGenerator.java:250)&lt;BR /&gt;at org.apache.tez.dag.app.dag.RootInputInitializerManager.lambda$runInitializer$3(RootInputInitializerManager.java:203)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:422)&lt;BR /&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1898)&lt;BR /&gt;at org.apache.tez.dag.app.dag.RootInputInitializerManager.runInitializer(RootInputInitializerManager.java:196)&lt;BR /&gt;at org.apache.tez.dag.app.dag.RootInputInitializerManager.runInitializerAndProcessResult(RootInputInitializerManager.java:177)&lt;BR /&gt;... 8 more&lt;BR /&gt;Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=68694: Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1&lt;BR /&gt;at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:159)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:80)&lt;BR /&gt;... 
3 more&lt;BR /&gt;Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.IPCUtil.wrapException(IPCUtil.java:206)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.AbstractRpcClient.onCallFinished(AbstractRpcClient.java:383)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.AbstractRpcClient.access$100(AbstractRpcClient.java:91)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:414)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:410)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.Call.callComplete(Call.java:117)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.Call.setException(Call.java:132)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler.cleanupCalls(NettyRpcDuplexHandler.java:203)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler.channelInactive(NettyRpcDuplexHandler.java:211)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:241)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:389)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:354)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)&lt;BR /&gt;at 
org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:241)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:81)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:277)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:241)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1405)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:901)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:819)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)&lt;BR /&gt;at 
org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:497)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)&lt;BR /&gt;... 1 more&lt;BR /&gt;Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed&lt;BR /&gt;... 26 more&lt;BR /&gt;2024-08-12 13:24:12,951 [INFO] [Dispatcher thread {Central}] |HistoryEventHandler.criticalEvents|: [HISTORY][DAG:dag_1723487266861_0004_2][Event:VERTEX_FINISHED]: vertexName=Map 1, vertexId=vertex_1723487266861_0004_2_00, initRequestedTime=1723494203715, initedTime=0, startRequestedTime=1723494203767, startedTime=0, finishTime=1723494252949, timeTaken=1723494252949, status=FAILED, diagnostics=Vertex vertex_1723487266861_0004_2_00 [Map 1] killed/failed due to:ROOT_INPUT_INIT_FAILURE&lt;BR /&gt;Vertex Input: a initializer failed, vertex=vertex_1723487266861_0004_2_00 [Map 1], org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=16, exceptions:&lt;BR /&gt;2024-08-12T20:24:12.897Z, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68694: Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1&lt;/P&gt;&lt;P&gt;at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:299)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:251)&lt;BR /&gt;at 
org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:58)&lt;BR /&gt;at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithoutRetries(RpcRetryingCallerImpl.java:192)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:267)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ClientScanner.loadCache(ClientScanner.java:435)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ClientScanner.nextWithSyncCache(ClientScanner.java:310)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ClientScanner.next(ClientScanner.java:595)&lt;BR /&gt;at org.apache.hadoop.hbase.MetaTableAccessor.scanMeta(MetaTableAccessor.java:800)&lt;BR /&gt;at org.apache.hadoop.hbase.MetaTableAccessor.scanMeta(MetaTableAccessor.java:768)&lt;BR /&gt;at org.apache.hadoop.hbase.MetaTableAccessor.scanMeta(MetaTableAccessor.java:721)&lt;BR /&gt;at org.apache.hadoop.hbase.MetaTableAccessor.scanMetaForTableRegions(MetaTableAccessor.java:716)&lt;BR /&gt;at org.apache.hadoop.hbase.client.HRegionLocator.listRegionLocations(HRegionLocator.java:114)&lt;BR /&gt;at org.apache.hadoop.hbase.client.HRegionLocator.getAllRegionLocations(HRegionLocator.java:78)&lt;BR /&gt;at org.apache.hadoop.hbase.mapreduce.RegionSizeCalculator.getRegionServersOfTable(RegionSizeCalculator.java:103)&lt;BR /&gt;at org.apache.hadoop.hbase.mapreduce.RegionSizeCalculator.init(RegionSizeCalculator.java:79)&lt;BR /&gt;at org.apache.hadoop.hbase.mapreduce.RegionSizeCalculator.&amp;lt;init&amp;gt;(RegionSizeCalculator.java:61)&lt;BR /&gt;at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.createRegionSizeCalculator(TableInputFormatBase.java:593)&lt;BR /&gt;at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.oneInputSplitPerRegion(TableInputFormatBase.java:294)&lt;BR /&gt;at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.getSplits(TableInputFormatBase.java:257)&lt;BR /&gt;at 
org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplitsInternal(HiveHBaseTableInputFormat.java:349)&lt;BR /&gt;at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.access$200(HiveHBaseTableInputFormat.java:68)&lt;BR /&gt;at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat$2.run(HiveHBaseTableInputFormat.java:271)&lt;BR /&gt;at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat$2.run(HiveHBaseTableInputFormat.java:269)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:422)&lt;BR /&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1898)&lt;BR /&gt;at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplits(HiveHBaseTableInputFormat.java:269)&lt;BR /&gt;at org.apache.hadoop.hive.ql.io.HiveInputFormat.addSplitsForGroup(HiveInputFormat.java:542)&lt;BR /&gt;at org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:850)&lt;BR /&gt;at org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.initialize(HiveSplitGenerator.java:250)&lt;BR /&gt;at org.apache.tez.dag.app.dag.RootInputInitializerManager.lambda$runInitializer$3(RootInputInitializerManager.java:203)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:422)&lt;BR /&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1898)&lt;BR /&gt;at org.apache.tez.dag.app.dag.RootInputInitializerManager.runInitializer(RootInputInitializerManager.java:196)&lt;BR /&gt;at org.apache.tez.dag.app.dag.RootInputInitializerManager.runInitializerAndProcessResult(RootInputInitializerManager.java:177)&lt;BR /&gt;at org.apache.tez.dag.app.dag.RootInputInitializerManager.lambda$createAndStartInitializing$2(RootInputInitializerManager.java:171)&lt;BR /&gt;at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)&lt;BR /&gt;at 
com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)&lt;BR /&gt;at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:69)&lt;BR /&gt;at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)&lt;BR /&gt;at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)&lt;BR /&gt;at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)&lt;BR /&gt;at java.lang.Thread.run(Thread.java:748)&lt;BR /&gt;Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=68694: Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1&lt;BR /&gt;at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:159)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:80)&lt;BR /&gt;... 
3 more&lt;BR /&gt;Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.IPCUtil.wrapException(IPCUtil.java:206)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.AbstractRpcClient.onCallFinished(AbstractRpcClient.java:383)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.AbstractRpcClient.access$100(AbstractRpcClient.java:91)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:414)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:410)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.Call.callComplete(Call.java:117)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.Call.setException(Call.java:132)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler.cleanupCalls(NettyRpcDuplexHandler.java:203)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler.channelInactive(NettyRpcDuplexHandler.java:211)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:241)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:389)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:354)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)&lt;BR /&gt;at 
org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:241)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:81)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:277)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:241)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1405)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:901)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:819)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)&lt;BR /&gt;at 
org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:497)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)&lt;BR /&gt;... 1 more&lt;BR /&gt;Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed&lt;BR /&gt;... 26 more&lt;BR /&gt;, counters=Counters: 0, vertexStats=firstTaskStartTime=-1, firstTasksToStart=[ ], lastTaskFinishTime=-1, lastTasksToFinish=[ ], minTaskDuration=-1, maxTaskDuration=-1, avgTaskDuration=-1.0, numSuccessfulTasks=0, shortestDurationTasks=[ ], longestDurationTasks=[ ], vertexTaskStats={numFailedTaskAttempts=0, numKilledTaskAttempts=0, numCompletedTasks=0, numSucceededTasks=0, numKilledTasks=0, numFailedTasks=0}, servicePluginInfo=ServicePluginInfo {containerLauncherName=TezYarn, taskSchedulerName=TezYarn, taskCommunicatorName=TezYarn, containerLauncherClassName=org.apache.tez.dag.app.launcher.TezContainerLauncherImpl, taskSchedulerClassName=org.apache.tez.dag.app.rm.YarnTaskSchedulerService, taskCommunicatorClassName=org.apache.tez.dag.app.TezTaskCommunicatorImpl }&lt;BR /&gt;2024-08-12 13:24:12,952 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: vertex_1723487266861_0004_2_00 [Map 1] transitioned from INITIALIZING to FAILED due to event V_ROOT_INPUT_FAILED&lt;BR /&gt;2024-08-12 13:24:12,953 [INFO] [Dispatcher thread {Central}] |impl.DAGImpl|: Vertex vertex_1723487266861_0004_2_00 [Map 1] completed., numCompletedVertices=1, numSuccessfulVertices=0, numFailedVertices=1, numKilledVertices=0, numVertices=2&lt;BR /&gt;2024-08-12 13:24:12,953 [INFO] [Dispatcher thread {Central}] |impl.DAGImpl|: Checking vertices for DAG completion, 
numCompletedVertices=1, numSuccessfulVertices=0, numFailedVertices=1, numKilledVertices=0, numVertices=2, commitInProgress=0, terminationCause=VERTEX_FAILURE&lt;BR /&gt;2024-08-12 13:24:12,953 [INFO] [Dispatcher thread {Central}] |impl.DAGImpl|: dag_1723487266861_0004_2 transitioned from RUNNING to TERMINATING due to event DAG_VERTEX_COMPLETED&lt;BR /&gt;2024-08-12 13:24:12,953 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: Invoking committer abort for vertex, vertexId=vertex_1723487266861_0004_2_01 [Reducer 2]&lt;BR /&gt;2024-08-12 13:24:12,955 [INFO] [Dispatcher thread {Central}] |HistoryEventHandler.criticalEvents|: [HISTORY][DAG:dag_1723487266861_0004_2][Event:VERTEX_FINISHED]: vertexName=Reducer 2, vertexId=vertex_1723487266861_0004_2_01, initRequestedTime=1723494203767, initedTime=1723494203782, startRequestedTime=0, startedTime=0, finishTime=1723494252953, timeTaken=1723494252953, status=KILLED, diagnostics=Vertex received Kill in INITED state.&lt;BR /&gt;Vertex vertex_1723487266861_0004_2_01 [Reducer 2] killed/failed due to:OTHER_VERTEX_FAILURE, counters=Counters: 0, vertexStats=firstTaskStartTime=0, firstTasksToStart=[ task_1723487266861_0004_2_01_000000 ], lastTaskFinishTime=-1, lastTasksToFinish=[ ], minTaskDuration=-1, maxTaskDuration=-1, avgTaskDuration=-1.0, numSuccessfulTasks=0, shortestDurationTasks=[ ], longestDurationTasks=[ ], vertexTaskStats={numFailedTaskAttempts=0, numKilledTaskAttempts=0, numCompletedTasks=0, numSucceededTasks=0, numKilledTasks=0, numFailedTasks=0}, servicePluginInfo=ServicePluginInfo {containerLauncherName=TezYarn, taskSchedulerName=TezYarn, taskCommunicatorName=TezYarn, containerLauncherClassName=org.apache.tez.dag.app.launcher.TezContainerLauncherImpl, taskSchedulerClassName=org.apache.tez.dag.app.rm.YarnTaskSchedulerService, taskCommunicatorClassName=org.apache.tez.dag.app.TezTaskCommunicatorImpl }&lt;BR /&gt;2024-08-12 13:24:12,955 [INFO] [Dispatcher thread {Central}] |impl.VertexImpl|: 
vertex_1723487266861_0004_2_01 [Reducer 2] transitioned from INITED to KILLED due to event V_TERMINATE&lt;BR /&gt;2024-08-12 13:24:12,955 [INFO] [Dispatcher thread {Central}] |impl.DAGImpl|: Vertex vertex_1723487266861_0004_2_01 [Reducer 2] completed., numCompletedVertices=2, numSuccessfulVertices=0, numFailedVertices=1, numKilledVertices=1, numVertices=2&lt;BR /&gt;2024-08-12 13:24:12,955 [INFO] [Dispatcher thread {Central}] |impl.DAGImpl|: Checking vertices for DAG completion, numCompletedVertices=2, numSuccessfulVertices=0, numFailedVertices=1, numKilledVertices=1, numVertices=2, commitInProgress=0, terminationCause=VERTEX_FAILURE&lt;BR /&gt;2024-08-12 13:24:12,955 [INFO] [Dispatcher thread {Central}] |impl.DAGImpl|: DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:1&lt;BR /&gt;2024-08-12 13:24:12,959 [INFO] [Dispatcher thread {Central}] |recovery.RecoveryService|: DAG completed, dagId=dag_1723487266861_0004_2, queueSize=0&lt;BR /&gt;2024-08-12 13:24:12,969 [INFO] [Dispatcher thread {Central}] |HistoryEventHandler.criticalEvents|: [HISTORY][DAG:dag_1723487266861_0004_2][Event:DAG_FINISHED]: dagId=dag_1723487266861_0004_2, startTime=1723494203714, finishTime=1723494252955, timeTaken=49241, status=FAILED, diagnostics=Vertex failed, vertexName=Map 1, vertexId=vertex_1723487266861_0004_2_00, diagnostics=[Vertex vertex_1723487266861_0004_2_00 [Map 1] killed/failed due to:ROOT_INPUT_INIT_FAILURE, Vertex Input: a initializer failed, vertex=vertex_1723487266861_0004_2_00 [Map 1], org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=16, exceptions:&lt;BR /&gt;2024-08-12T20:24:12.897Z, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68694: Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, 
hostname=host12.com,16020,1723487262759, seqNum=-1&lt;/P&gt;&lt;P&gt;at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:299)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:251)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:58)&lt;BR /&gt;at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithoutRetries(RpcRetryingCallerImpl.java:192)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:267)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ClientScanner.loadCache(ClientScanner.java:435)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ClientScanner.nextWithSyncCache(ClientScanner.java:310)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ClientScanner.next(ClientScanner.java:595)&lt;BR /&gt;at org.apache.hadoop.hbase.MetaTableAccessor.scanMeta(MetaTableAccessor.java:800)&lt;BR /&gt;at org.apache.hadoop.hbase.MetaTableAccessor.scanMeta(MetaTableAccessor.java:768)&lt;BR /&gt;at org.apache.hadoop.hbase.MetaTableAccessor.scanMeta(MetaTableAccessor.java:721)&lt;BR /&gt;at org.apache.hadoop.hbase.MetaTableAccessor.scanMetaForTableRegions(MetaTableAccessor.java:716)&lt;BR /&gt;at org.apache.hadoop.hbase.client.HRegionLocator.listRegionLocations(HRegionLocator.java:114)&lt;BR /&gt;at org.apache.hadoop.hbase.client.HRegionLocator.getAllRegionLocations(HRegionLocator.java:78)&lt;BR /&gt;at org.apache.hadoop.hbase.mapreduce.RegionSizeCalculator.getRegionServersOfTable(RegionSizeCalculator.java:103)&lt;BR /&gt;at org.apache.hadoop.hbase.mapreduce.RegionSizeCalculator.init(RegionSizeCalculator.java:79)&lt;BR /&gt;at org.apache.hadoop.hbase.mapreduce.RegionSizeCalculator.&amp;lt;init&amp;gt;(RegionSizeCalculator.java:61)&lt;BR /&gt;at 
org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.createRegionSizeCalculator(TableInputFormatBase.java:593)&lt;BR /&gt;at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.oneInputSplitPerRegion(TableInputFormatBase.java:294)&lt;BR /&gt;at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.getSplits(TableInputFormatBase.java:257)&lt;BR /&gt;at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplitsInternal(HiveHBaseTableInputFormat.java:349)&lt;BR /&gt;at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.access$200(HiveHBaseTableInputFormat.java:68)&lt;BR /&gt;at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat$2.run(HiveHBaseTableInputFormat.java:271)&lt;BR /&gt;at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat$2.run(HiveHBaseTableInputFormat.java:269)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:422)&lt;BR /&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1898)&lt;BR /&gt;at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplits(HiveHBaseTableInputFormat.java:269)&lt;BR /&gt;at org.apache.hadoop.hive.ql.io.HiveInputFormat.addSplitsForGroup(HiveInputFormat.java:542)&lt;BR /&gt;at org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:850)&lt;BR /&gt;at org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.initialize(HiveSplitGenerator.java:250)&lt;BR /&gt;at org.apache.tez.dag.app.dag.RootInputInitializerManager.lambda$runInitializer$3(RootInputInitializerManager.java:203)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:422)&lt;BR /&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1898)&lt;BR /&gt;at org.apache.tez.dag.app.dag.RootInputInitializerManager.runInitializer(RootInputInitializerManager.java:196)&lt;BR /&gt;at 
org.apache.tez.dag.app.dag.RootInputInitializerManager.runInitializerAndProcessResult(RootInputInitializerManager.java:177)&lt;BR /&gt;at org.apache.tez.dag.app.dag.RootInputInitializerManager.lambda$createAndStartInitializing$2(RootInputInitializerManager.java:171)&lt;BR /&gt;at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)&lt;BR /&gt;at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)&lt;BR /&gt;at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:69)&lt;BR /&gt;at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)&lt;BR /&gt;at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)&lt;BR /&gt;at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)&lt;BR /&gt;at java.lang.Thread.run(Thread.java:748)&lt;BR /&gt;Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=68694: Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed row 'external_ml_serde,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=host12.com,16020,1723487262759, seqNum=-1&lt;BR /&gt;at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:159)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:80)&lt;BR /&gt;... 
3 more&lt;BR /&gt;Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Call to host12.com/xx.xxx.xx.xxx:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.IPCUtil.wrapException(IPCUtil.java:206)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.AbstractRpcClient.onCallFinished(AbstractRpcClient.java:383)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.AbstractRpcClient.access$100(AbstractRpcClient.java:91)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:414)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:410)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.Call.callComplete(Call.java:117)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.Call.setException(Call.java:132)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler.cleanupCalls(NettyRpcDuplexHandler.java:203)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler.channelInactive(NettyRpcDuplexHandler.java:211)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:241)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:389)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:354)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)&lt;BR /&gt;at 
org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:241)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:81)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:277)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:241)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1405)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:901)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:819)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)&lt;BR /&gt;at 
org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:497)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)&lt;BR /&gt;at org.apache.hbase.thirdparty.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)&lt;BR /&gt;... 1 more&lt;BR /&gt;Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed&lt;BR /&gt;... 26 more&lt;BR /&gt;]&lt;BR /&gt;Vertex killed, vertexName=Reducer 2, vertexId=vertex_1723487266861_0004_2_01, diagnostics=[Vertex received Kill in INITED state., Vertex vertex_1723487266861_0004_2_01 [Reducer 2] killed/failed due to:OTHER_VERTEX_FAILURE]&lt;BR /&gt;DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:1, counters=Counters: 2, org.apache.tez.common.counters.DAGCounter, AM_CPU_MILLISECONDS=4150, AM_GC_TIME_MILLIS=55&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 12 Aug 2024 20:38:22 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/INSERT-OVERWRITE-from-HBase-External-Table-into-Hive-Managed/m-p/391667#M247717</guid>
      <dc:creator>Marks_08</dc:creator>
      <dc:date>2024-08-12T20:38:22Z</dc:date>
    </item>
    <item>
      <title>Re: INSERT OVERWRITE from HBase External Table into Hive Managed Table.</title>
      <link>https://community.cloudera.com/t5/Support-Questions/INSERT-OVERWRITE-from-HBase-External-Table-into-Hive-Managed/m-p/391692#M247728</link>
      <description>&lt;P&gt;From the error stack trace, it looks like the problem is a connection-closed exception between the Tez AM application and the HBase server (host12.com).&lt;BR /&gt;&lt;BR /&gt;This connection closure led to a java.net.SocketTimeoutException: the Tez AM tried to communicate with HBase, but the call timed out after 60 seconds (callTimeout) because it did not receive a response within that window.&lt;BR /&gt;&lt;BR /&gt;The failure occurred during the initialization of the vertex "vertex_1723487266861_0004_2_00" within the Tez application.&lt;BR /&gt;&lt;BR /&gt;&lt;STRONG&gt;Possible Reasons&lt;BR /&gt;&lt;BR /&gt;&lt;/STRONG&gt;The HBase server might be overloaded, experiencing internal issues, or down entirely.&lt;BR /&gt;Hive might have an incorrect HBase server address or port configured.&lt;/P&gt;&lt;DIV class="container"&gt;&amp;nbsp;&lt;/DIV&gt;</description>
      <pubDate>Tue, 13 Aug 2024 11:37:41 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/INSERT-OVERWRITE-from-HBase-External-Table-into-Hive-Managed/m-p/391692#M247728</guid>
      <dc:creator>ggangadharan</dc:creator>
      <dc:date>2024-08-13T11:37:41Z</dc:date>
    </item>
    <item>
      <title>Re: INSERT OVERWRITE from HBase External Table into Hive Managed Table.</title>
      <link>https://community.cloudera.com/t5/Support-Questions/INSERT-OVERWRITE-from-HBase-External-Table-into-Hive-Managed/m-p/391726#M247745</link>
      <description>&lt;P&gt;Thanks&amp;nbsp;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/92016"&gt;@ggangadharan&lt;/a&gt;&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;As far as I can see HBase is up and running but I found something in the HBase log:&lt;BR /&gt;&lt;BR /&gt;2024-08-13 21:53:30,583 INFO SecurityLogger.org.apache.hadoop.hbase.Server: Auth successful for hive/HOST@REALM (auth:KERBEROS)&lt;BR /&gt;2024-08-13 21:53:30,584 INFO SecurityLogger.org.apache.hadoop.hbase.Server: Connection from xx.xxx.xx.xxx:55106, version=2.2.3.7.1.7.0-551, sasl=true, ugi=hive/HOST@REALM (auth:KERBEROS), service=ClientService&lt;BR /&gt;2024-08-13 21:53:30,584 INFO SecurityLogger.org.apache.hadoop.security.authorize.ServiceAuthorizationManager: Authorization successful for hive/HOST@REALM (auth:KERBEROS) for protocol=interface org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$BlockingInterface&lt;BR /&gt;2024-08-13 21:53:38,853 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39718&lt;BR /&gt;2024-08-13 21:53:38,853 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39718&lt;BR /&gt;2024-08-13 21:53:39,056 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39720&lt;BR /&gt;2024-08-13 21:53:39,056 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39720&lt;BR /&gt;2024-08-13 21:53:39,361 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39722&lt;BR /&gt;2024-08-13 21:53:39,361 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39722&lt;BR /&gt;2024-08-13 21:53:39,869 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received 
HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39724&lt;BR /&gt;2024-08-13 21:53:39,870 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39724&lt;BR /&gt;2024-08-13 21:53:40,877 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39726&lt;BR /&gt;2024-08-13 21:53:40,877 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39726&lt;BR /&gt;2024-08-13 21:53:42,882 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39728&lt;BR /&gt;2024-08-13 21:53:42,882 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39728&lt;BR /&gt;2024-08-13 21:53:46,219 INFO org.apache.hadoop.hbase.io.hfile.LruBlockCache: totalSize=9.18 MB, freeSize=12.20 GB, max=12.21 GB, blockCount=5, accesses=7481, hits=7461, hitRatio=99.73%, , cachingAccesses=7469, cachingHits=7461, cachingHitsRatio=99.89%, evictions=2009, evicted=0, evictedPerRun=0.0&lt;BR /&gt;2024-08-13 21:53:46,914 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39730&lt;BR /&gt;2024-08-13 21:53:46,914 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39730&lt;BR /&gt;2024-08-13 21:53:50,477 INFO org.apache.hadoop.hbase.ScheduledChore: CompactionThroughputTuner average execution time: 8653 ns.&lt;BR /&gt;2024-08-13 21:53:50,572 INFO org.apache.hadoop.hbase.replication.regionserver.Replication: Global stats: WAL Edits Buffer Used=0B, Limit=268435456B&lt;/P&gt;&lt;P&gt;2024-08-13 21:53:55,216 INFO SecurityLogger.org.apache.hadoop.hbase.Server: Auth successful for hbase/HOST@REALM (auth:KERBEROS)&lt;BR /&gt;2024-08-13 21:53:55,216 INFO SecurityLogger.org.apache.hadoop.hbase.Server: Connection 
from xx.xxx.xx.xxx:55174, version=2.2.3.7.1.7.0-551, sasl=true, ugi=hbase/HOST@REALM (auth:KERBEROS), service=ClientService&lt;BR /&gt;2024-08-13 21:53:55,216 INFO SecurityLogger.org.apache.hadoop.security.authorize.ServiceAuthorizationManager: Authorization successful for hbase/HOST@REALM (auth:KERBEROS) for protocol=interface org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$BlockingInterface&lt;BR /&gt;2024-08-13 21:53:56,136 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping HBase metrics system...&lt;BR /&gt;2024-08-13 21:53:56,136 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: HBase metrics system stopped.&lt;BR /&gt;2024-08-13 21:53:56,638 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties&lt;BR /&gt;2024-08-13 21:53:56,641 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).&lt;BR /&gt;2024-08-13 21:53:56,641 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: HBase metrics system started&lt;BR /&gt;&lt;BR /&gt;&lt;BR /&gt;This warning (WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39730) only appears when running the statement:&lt;BR /&gt;&lt;BR /&gt;&lt;STRONG&gt;insert overwrite table managed_ml&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;STRONG&gt;select key, cf1_id , cf1_name from c_0external_ml;&lt;/STRONG&gt;&lt;BR /&gt;&lt;BR /&gt;Other statements, such as &lt;STRONG&gt;insert into&amp;nbsp;c_0external_ml values (1,2,3); &lt;/STRONG&gt;run perfectly.&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Does this error sound familiar to you?&lt;/P&gt;</description>
      <pubDate>Wed, 14 Aug 2024 05:04:15 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/INSERT-OVERWRITE-from-HBase-External-Table-into-Hive-Managed/m-p/391726#M247745</guid>
      <dc:creator>Marks_08</dc:creator>
      <dc:date>2024-08-14T05:04:15Z</dc:date>
    </item>
  </channel>
</rss>