<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Getting error during batch execution of records in Hive in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/394580#M248747</link>
    <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/118157"&gt;@Arathi&lt;/a&gt;&amp;nbsp;Welcome to the Cloudera Community!&lt;BR /&gt;&lt;BR /&gt;To help you get the best possible solution, I have tagged our Hive experts&amp;nbsp;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/38161"&gt;@cravani&lt;/a&gt;&amp;nbsp;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/70785"&gt;@Shmoo&lt;/a&gt;&amp;nbsp; who may be able to assist you further.&lt;BR /&gt;&lt;BR /&gt;Please keep us updated on your post, and we hope you find a satisfactory solution to your query.&lt;/P&gt;</description>
    <pubDate>Fri, 04 Oct 2024 16:08:59 GMT</pubDate>
    <dc:creator>DianaTorres</dc:creator>
    <dc:date>2024-10-04T16:08:59Z</dc:date>
    <item>
      <title>Getting error during batch execution of records in Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/394567#M248745</link>
      <description>&lt;P&gt;&lt;SPAN&gt;CDICO2024E: Batch of 5 records could not be written to the data source: [SQLSTATE HY000] c error: [IBM][Hive JDBC Driver][Hive]Error while compiling statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask (error code: DATA_IO_ERROR)&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 04 Oct 2024 13:07:08 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/394567#M248745</guid>
      <dc:creator>Arathi</dc:creator>
      <dc:date>2024-10-04T13:07:08Z</dc:date>
    </item>
    <item>
      <title>Re: Getting error during batch execution of records in Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/394580#M248747</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/118157"&gt;@Arathi&lt;/a&gt;&amp;nbsp;Welcome to the Cloudera Community!&lt;BR /&gt;&lt;BR /&gt;To help you get the best possible solution, I have tagged our Hive experts&amp;nbsp;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/38161"&gt;@cravani&lt;/a&gt;&amp;nbsp;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/70785"&gt;@Shmoo&lt;/a&gt;&amp;nbsp; who may be able to assist you further.&lt;BR /&gt;&lt;BR /&gt;Please keep us updated on your post, and we hope you find a satisfactory solution to your query.&lt;/P&gt;</description>
      <pubDate>Fri, 04 Oct 2024 16:08:59 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/394580#M248747</guid>
      <dc:creator>DianaTorres</dc:creator>
      <dc:date>2024-10-04T16:08:59Z</dc:date>
    </item>
    <item>
      <title>Re: Getting error during batch execution of records in Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/394631#M248770</link>
      <description>&lt;UL&gt;&lt;LI&gt;The query seems to have failed during the compilation phase.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;Error while compiling statement&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;At the same time, note the following as well:&amp;nbsp;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;This error typically occurs when one of the child tasks fails.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask &lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;To investigate further, please provide the beeline console output and the complete stack trace from the HS2 logs if the failure occurred during the compilation phase.&lt;/LI&gt;&lt;LI&gt;If the failure occurred on the YARN side, please share the complete stack trace from the child task along with the beeline console output for additional assistance.&lt;/LI&gt;&lt;LI&gt;Additionally, please include the DDL of the associated tables.&lt;/LI&gt;&lt;/UL&gt;</description>
      <pubDate>Mon, 07 Oct 2024 07:17:52 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/394631#M248770</guid>
      <dc:creator>ggangadharan</dc:creator>
      <dc:date>2024-10-07T07:17:52Z</dc:date>
    </item>
    <item>
      <title>Re: Getting error during batch execution of records in Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/394766#M248797</link>
      <description>&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;2024-10-04 11:44:39.765: [TargetConnector.0][hive] CDICO2024E: Batch of 5 records could not be written to the data source: [SQLSTATE HY000] c error: [IBM][Hive JDBC Driver][Hive]Error while compiling statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask&lt;BR /&gt;com.ibm.connect.api.SQLErrorException: CDICO2024E: Batch of 5 records could not be written to the data source: [SQLSTATE HY000] c error: [IBM][Hive JDBC Driver][Hive]Error while compiling statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask (error code: DATA_IO_ERROR)&lt;BR /&gt;at com.ibm.connect.base.util.Utils.toSQLErrorException(Utils.java:331)&lt;BR /&gt;at com.ibm.connect.jdbc.AbstractJdbcOutputInteraction.handleAndThrowBatchError(AbstractJdbcOutputInteraction.java:1142)&lt;BR /&gt;at com.ibm.connect.jdbc.AbstractJdbcOutputInteraction.flushBatch(AbstractJdbcOutputInteraction.java:1025)&lt;BR /&gt;at com.ibm.connect.jdbc.AbstractJdbcConnector.commit(AbstractJdbcConnector.java:598)&lt;BR /&gt;at com.ibm.connect.test.steps.ConnectorWriteStep.lambda$performLastCommit$1(ConnectorWriteStep.java:641)&lt;BR /&gt;at java.base/java.lang.Iterable.forEach(Iterable.java:75)&lt;BR /&gt;at com.ibm.connect.test.steps.ConnectorWriteStep.performLastCommit(ConnectorWriteStep.java:640)&lt;BR /&gt;at com.ibm.connect.test.steps.ConnectorWriteStep.writePredefinedRecords(ConnectorWriteStep.java:515)&lt;BR /&gt;at com.ibm.connect.test.steps.ConnectorWriteStep.writePredefined(ConnectorWriteStep.java:1021)&lt;BR /&gt;at com.ibm.connect.test.steps.ConnectorWriteStep.writeRecords(ConnectorWriteStep.java:1032)&lt;BR /&gt;at com.ibm.connect.test.steps.ConnectorWriteStep.run(ConnectorWriteStep.java:382)&lt;BR /&gt;at com.ibm.connect.test.Test.run(Test.java:504)&lt;BR /&gt;at com.ibm.connect.test.TestRun.runTest(TestRun.java:158)&lt;BR /&gt;at 
com.ibm.connect.test.Framework.run(Framework.java:638)&lt;BR /&gt;at com.ibm.connect.test.Framework.exec(Framework.java:192)&lt;BR /&gt;at com.ibm.connect.test.Framework.main(Framework.java:115)&lt;BR /&gt;at hive.scapi.kerberos.Test003_KerberosConnection.test(Test003_KerberosConnection.java:9)&lt;BR /&gt;at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103)&lt;BR /&gt;at java.base/java.lang.reflect.Method.invoke(Method.java:586)&lt;BR /&gt;at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)&lt;BR /&gt;at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)&lt;BR /&gt;at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)&lt;BR /&gt;at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)&lt;BR /&gt;at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:299)&lt;BR /&gt;at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:293)&lt;BR /&gt;at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)&lt;BR /&gt;at java.base/java.lang.Thread.run(Thread.java:1595)&lt;BR /&gt;Caused by: com.ibm.connect.driver.jdbc.hive.base.c: [IBM][Hive JDBC Driver][Hive]Error while compiling statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask&lt;BR /&gt;at com.ibm.connect.driver.jdbc.hive.base.BasePreparedStatement.executeBatch(|Hive|6.0.1.1383|:777)&lt;BR /&gt;at com.ibm.connect.jdbc.AbstractJdbcOutputInteraction.flushBatch(AbstractJdbcOutputInteraction.java:993)&lt;BR /&gt;... 
24 more&lt;BR /&gt;2024-10-04 11:44:39.769: [TargetConnector.0][hive] CDICO2024E: Batch of 5 records could not be written to the data source: [SQLSTATE HY000] c error: [IBM][Hive JDBC Driver][Hive]Error while compiling statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask&lt;BR /&gt;com.ibm.connect.api.SQLErrorException: CDICO2024E: Batch of 5 records could not be written to the data source: [SQLSTATE HY000] c error: [IBM][Hive JDBC Driver][Hive]Error while compiling statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask (error code: DATA_IO_ERROR)&lt;BR /&gt;at com.ibm.connect.base.util.Utils.toSQLErrorException(Utils.java:331)&lt;BR /&gt;at com.ibm.connect.jdbc.AbstractJdbcOutputInteraction.handleAndThrowBatchError(AbstractJdbcOutputInteraction.java:1142)&lt;BR /&gt;at com.ibm.connect.jdbc.AbstractJdbcOutputInteraction.flushBatch(AbstractJdbcOutputInteraction.java:1025)&lt;BR /&gt;at com.ibm.connect.jdbc.AbstractJdbcConnector.commit(AbstractJdbcConnector.java:598)&lt;BR /&gt;at com.ibm.connect.test.steps.ConnectorWriteStep.lambda$performLastCommit$1(ConnectorWriteStep.java:641)&lt;BR /&gt;at java.base/java.lang.Iterable.forEach(Iterable.java:75)&lt;BR /&gt;at com.ibm.connect.test.steps.ConnectorWriteStep.performLastCommit(ConnectorWriteStep.java:640)&lt;BR /&gt;at com.ibm.connect.test.steps.ConnectorWriteStep.writePredefinedRecords(ConnectorWriteStep.java:515)&lt;BR /&gt;at com.ibm.connect.test.steps.ConnectorWriteStep.writePredefined(ConnectorWriteStep.java:1021)&lt;BR /&gt;at com.ibm.connect.test.steps.ConnectorWriteStep.writeRecords(ConnectorWriteStep.java:1032)&lt;BR /&gt;at com.ibm.connect.test.steps.ConnectorWriteStep.run(ConnectorWriteStep.java:382)&lt;BR /&gt;at com.ibm.connect.test.Test.run(Test.java:504)&lt;BR /&gt;at com.ibm.connect.test.TestRun.runTest(TestRun.java:158)&lt;BR /&gt;at com.ibm.connect.test.Framework.run(Framework.java:638)&lt;BR /&gt;at 
com.ibm.connect.test.Framework.exec(Framework.java:192)&lt;BR /&gt;at com.ibm.connect.test.Framework.main(Framework.java:115)&lt;BR /&gt;at hive.scapi.kerberos.Test003_KerberosConnection.test(Test003_KerberosConnection.java:9)&lt;BR /&gt;at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103)&lt;BR /&gt;at java.base/java.lang.reflect.Method.invoke(Method.java:586)&lt;BR /&gt;at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)&lt;BR /&gt;at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)&lt;BR /&gt;at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)&lt;BR /&gt;at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)&lt;BR /&gt;at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:299)&lt;BR /&gt;at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:293)&lt;BR /&gt;at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)&lt;BR /&gt;at java.base/java.lang.Thread.run(Thread.java:1595)&lt;BR /&gt;Caused by: com.ibm.connect.driver.jdbc.hive.base.c: [IBM][Hive JDBC Driver][Hive]Error while compiling statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask&lt;BR /&gt;at com.ibm.connect.driver.jdbc.hive.base.BasePreparedStatement.executeBatch(|Hive|6.0.1.1383|:777)&lt;BR /&gt;at com.ibm.connect.jdbc.AbstractJdbcOutputInteraction.flushBatch(AbstractJdbcOutputInteraction.java:993)&lt;BR /&gt;... 
24 more&lt;BR /&gt;2024-10-04 11:44:39.772: Exception encountered while writing records: com.ibm.connect.api.SQLErrorException: CDICO2024E: Batch of 5 records could not be written to the data source: [SQLSTATE HY000] c error: [IBM][Hive JDBC Driver][Hive]Error while compiling statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask (error code: DATA_IO_ERROR)&lt;/P&gt;</description>
      <pubDate>Tue, 08 Oct 2024 15:05:31 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/394766#M248797</guid>
      <dc:creator>testjo</dc:creator>
      <dc:date>2024-10-08T15:05:31Z</dc:date>
    </item>
    <item>
      <title>Re: Getting error during batch execution of records in Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/394774#M248798</link>
      <description>&lt;P&gt;DDL:&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Table creation statement&lt;/STRONG&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;- CREATE TABLE&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;TAB_SCAPI_DIGNASMBP_1728048648203(C1&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;INT,&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;C2&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;VARCHAR(20)) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LINES TERMINATED BY '&lt;BR /&gt;' STORED AS TEXTFILE&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Prepared statement for insertion&lt;/STRONG&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;- INSERT INTO&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;TAB_SCAPI_DIGNASMBP_1728048648203&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;(C1,&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;C2) VALUES (?, ?)&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Values inserting&lt;/STRONG&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;-&lt;BR /&gt;1|VALUE ONE;&lt;BR /&gt;2|VALUE TWO;&lt;BR /&gt;3|NULL;&lt;BR /&gt;4|VALUE FOUR;&lt;BR /&gt;5|VALUE FIVE;&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Upon executing the command "&lt;SPAN&gt;INSERT INTO TAB_SCAPI_DIGNASMBP_1728048648203 (C1, C2) VALUES (3, NULL);"&lt;BR /&gt;we are receiving the following error:&lt;BR /&gt;org.jkiss.dbeaver.model.sql.DBSQLException: SQL Error [1] [08S01]: Error while compiling statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask&lt;BR /&gt;at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.executeStatement(JDBCStatementImpl.java:133)&lt;BR /&gt;at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeStatement(SQLQueryJob.java:582)&lt;BR /&gt;at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.lambda$1(SQLQueryJob.java:491)&lt;BR /&gt;at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeSingleQuery(SQLQueryJob.java:501)&lt;BR /&gt;at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.extractData(SQLQueryJob.java:934)&lt;BR /&gt;at org.jkiss.dbeaver.ui.editors.sql.SQLEditor$QueryResultsContainer.readData(SQLEditor.java:3940)&lt;BR 
/&gt;at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.lambda$0(ResultSetJobDataRead.java:123)&lt;BR /&gt;at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:190)&lt;BR /&gt;at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.run(ResultSetJobDataRead.java:121)&lt;BR /&gt;at org.jkiss.dbeaver.ui.controls.resultset.ResultSetViewer$ResultSetDataPumpJob.run(ResultSetViewer.java:5142)&lt;BR /&gt;at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:105)&lt;BR /&gt;at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63)&lt;BR /&gt;Caused by: java.sql.SQLException: Error while compiling statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask&lt;BR /&gt;at org.apache.hive.jdbc.HiveStatement.waitForOperationToComplete(HiveStatement.java:354)&lt;BR /&gt;at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:245)&lt;BR /&gt;at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.execute(JDBCStatementImpl.java:330)&lt;BR /&gt;at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.executeStatement(JDBCStatementImpl.java:131)&lt;BR /&gt;&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 08 Oct 2024 15:54:06 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/394774#M248798</guid>
      <dc:creator>testjo</dc:creator>
      <dc:date>2024-10-08T15:54:06Z</dc:date>
    </item>
    <item>
      <title>Re: Getting error during batch execution of records in Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/394807#M248806</link>
      <description>&lt;P&gt;&lt;SPAN&gt;According to the requirement, please find below the insert statement.&lt;BR /&gt;&lt;BR /&gt;&lt;/SPAN&gt;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;INSERT INTO TAB_SCAPI_DIGNASMBP_1728048648203 (C1, C2) VALUES (1,"|VALUE ONE;"), (2,"|VALUE TWO;"), (3,"|NULL;"), (4,"|VALUE FOUR;"), (5,"|VALUE FIVE;");&lt;/LI-CODE&gt;&lt;P&gt;&lt;SPAN&gt;Kindly review the results after executing the above insert statement.&lt;BR /&gt;&lt;/SPAN&gt;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;INFO  : Compiling command(queryId=hive_20241009052425_caec4881-3c5a-4a18-9599-b20c368de25d): INSERT INTO TAB_SCAPI_DIGNASMBP_1728048648203 (C1, C2) VALUES
(1,"|VALUE ONE;"),
(2,"|VALUE TWO;"),
(3,"|NULL;"),
(4,"|VALUE FOUR;"),
(5,"|VALUE FIVE;")
INFO  : Semantic Analysis Completed (retrial = false)
INFO  : Created Hive schema: Schema(fieldSchemas:[FieldSchema(name:_col0, type:int, comment:null), FieldSchema(name:_col1, type:varchar(20), comment:null)], properties:null)
INFO  : Completed compiling command(queryId=hive_20241009052425_caec4881-3c5a-4a18-9599-b20c368de25d); Time taken: 1.655 seconds
INFO  : Executing command(queryId=hive_20241009052425_caec4881-3c5a-4a18-9599-b20c368de25d): INSERT INTO TAB_SCAPI_DIGNASMBP_1728048648203 (C1, C2) VALUES
(1,"|VALUE ONE;"),
(2,"|VALUE TWO;"),
(3,"|NULL;"),
(4,"|VALUE FOUR;"),
(5,"|VALUE FIVE;")
INFO  : Query ID = hive_20241009052425_caec4881-3c5a-4a18-9599-b20c368de25d
INFO  : Total jobs = 1
INFO  : Launching Job 1 out of 1
INFO  : Starting task [Stage-1:MAPRED] in serial mode
INFO  : Subscribed to counters: [] for queryId: hive_20241009052425_caec4881-3c5a-4a18-9599-b20c368de25d
INFO  : Session is already open
INFO  : Dag name: INSERT INTO TAB_SCAPI_DIGNASMBP_17...FIVE;") (Stage-1)
INFO  : Tez session was closed. Reopening...
INFO  : Session re-established.
INFO  : Session re-established.
INFO  : Status: Running (Executing on YARN cluster with App id application_1728390353038_0009)

----------------------------------------------------------------------------------------------
        VERTICES      MODE        STATUS  TOTAL  COMPLETED  RUNNING  PENDING  FAILED  KILLED
----------------------------------------------------------------------------------------------
Map 1 .......... container     SUCCEEDED      1          1        0        0       0       0
Reducer 2 ...... container     SUCCEEDED      1          1        0        0       0       0
----------------------------------------------------------------------------------------------
VERTICES: 02/02  [==========================&amp;gt;&amp;gt;] 100%  ELAPSED TIME: 9.37 s
----------------------------------------------------------------------------------------------
INFO  : Status: DAG finished successfully in 8.10 seconds
INFO  :
INFO  : Query Execution Summary
INFO  : ----------------------------------------------------------------------------------------------
INFO  : OPERATION                            DURATION
INFO  : ----------------------------------------------------------------------------------------------
INFO  : Compile Query                           1.66s
INFO  : Prepare Plan                            0.36s
INFO  : Get Query Coordinator (AM)              0.01s
INFO  : Submit Plan                             7.02s
INFO  : Start DAG                               0.11s
INFO  : Run DAG                                 8.10s
INFO  : ----------------------------------------------------------------------------------------------
INFO  :
INFO  : Task Execution Summary
INFO  : ----------------------------------------------------------------------------------------------
INFO  :   VERTICES      DURATION(ms)   CPU_TIME(ms)    GC_TIME(ms)   INPUT_RECORDS   OUTPUT_RECORDS
INFO  : ----------------------------------------------------------------------------------------------
INFO  :      Map 1           3598.00          5,920            102               3                1
INFO  :  Reducer 2            365.00            830              0               1                0
INFO  : ----------------------------------------------------------------------------------------------
INFO  :
INFO  : org.apache.tez.common.counters.DAGCounter:
INFO  :    NUM_SUCCEEDED_TASKS: 2
INFO  :    TOTAL_LAUNCHED_TASKS: 2
INFO  :    RACK_LOCAL_TASKS: 1
INFO  :    AM_CPU_MILLISECONDS: 3450
INFO  :    AM_GC_TIME_MILLIS: 21
INFO  : File System Counters:
INFO  :    FILE_BYTES_READ: 141
INFO  :    FILE_BYTES_WRITTEN: 141
INFO  :    HDFS_BYTES_WRITTEN: 373
INFO  :    HDFS_READ_OPS: 5
INFO  :    HDFS_WRITE_OPS: 5
INFO  :    HDFS_OP_CREATE: 3
INFO  :    HDFS_OP_GET_FILE_STATUS: 5
INFO  :    HDFS_OP_RENAME: 2
INFO  : org.apache.tez.common.counters.TaskCounter:
INFO  :    SPILLED_RECORDS: 0
INFO  :    NUM_SHUFFLED_INPUTS: 1
INFO  :    NUM_FAILED_SHUFFLE_INPUTS: 0
INFO  :    GC_TIME_MILLIS: 102
INFO  :    TASK_DURATION_MILLIS: 3845
INFO  :    CPU_MILLISECONDS: 6750
INFO  :    PHYSICAL_MEMORY_BYTES: 4227858432
INFO  :    VIRTUAL_MEMORY_BYTES: 10972979200
INFO  :    COMMITTED_HEAP_BYTES: 4227858432
INFO  :    INPUT_RECORDS_PROCESSED: 5
INFO  :    INPUT_SPLIT_LENGTH_BYTES: 1
INFO  :    OUTPUT_RECORDS: 1
INFO  :    OUTPUT_LARGE_RECORDS: 0
INFO  :    OUTPUT_BYTES: 88
INFO  :    OUTPUT_BYTES_WITH_OVERHEAD: 96
INFO  :    OUTPUT_BYTES_PHYSICAL: 133
INFO  :    ADDITIONAL_SPILLS_BYTES_WRITTEN: 0
INFO  :    ADDITIONAL_SPILLS_BYTES_READ: 0
INFO  :    ADDITIONAL_SPILL_COUNT: 0
INFO  :    SHUFFLE_BYTES: 109
INFO  :    SHUFFLE_BYTES_DECOMPRESSED: 96
INFO  :    SHUFFLE_BYTES_TO_MEM: 0
INFO  :    SHUFFLE_BYTES_TO_DISK: 0
INFO  :    SHUFFLE_BYTES_DISK_DIRECT: 109
INFO  :    SHUFFLE_PHASE_TIME: 65
INFO  :    FIRST_EVENT_RECEIVED: 39
INFO  :    LAST_EVENT_RECEIVED: 39
INFO  :    DATA_BYTES_VIA_EVENT: 0
INFO  : HIVE:
INFO  :    CREATED_FILES: 2
INFO  :    DESERIALIZE_ERRORS: 0
INFO  :    RECORDS_IN_Map_1: 3
INFO  :    RECORDS_OUT_0: 1
INFO  :    RECORDS_OUT_1_default.tab_scapi_dignasmbp_1728048648203: 5
INFO  :    RECORDS_OUT_INTERMEDIATE_Map_1: 1
INFO  :    RECORDS_OUT_INTERMEDIATE_Reducer_2: 0
INFO  :    RECORDS_OUT_OPERATOR_FS_12: 1
INFO  :    RECORDS_OUT_OPERATOR_FS_5: 5
INFO  :    RECORDS_OUT_OPERATOR_GBY_10: 1
INFO  :    RECORDS_OUT_OPERATOR_GBY_8: 1
INFO  :    RECORDS_OUT_OPERATOR_MAP_0: 0
INFO  :    RECORDS_OUT_OPERATOR_RS_9: 1
INFO  :    RECORDS_OUT_OPERATOR_SEL_1: 1
INFO  :    RECORDS_OUT_OPERATOR_SEL_3: 5
INFO  :    RECORDS_OUT_OPERATOR_SEL_7: 5
INFO  :    RECORDS_OUT_OPERATOR_TS_0: 1
INFO  :    RECORDS_OUT_OPERATOR_UDTF_2: 5
INFO  :    TOTAL_TABLE_ROWS_WRITTEN: 5
INFO  : TaskCounter_Map_1_INPUT__dummy_table:
INFO  :    INPUT_RECORDS_PROCESSED: 4
INFO  :    INPUT_SPLIT_LENGTH_BYTES: 1
INFO  : TaskCounter_Map_1_OUTPUT_Reducer_2:
INFO  :    ADDITIONAL_SPILLS_BYTES_READ: 0
INFO  :    ADDITIONAL_SPILLS_BYTES_WRITTEN: 0
INFO  :    ADDITIONAL_SPILL_COUNT: 0
INFO  :    DATA_BYTES_VIA_EVENT: 0
INFO  :    OUTPUT_BYTES: 88
INFO  :    OUTPUT_BYTES_PHYSICAL: 133
INFO  :    OUTPUT_BYTES_WITH_OVERHEAD: 96
INFO  :    OUTPUT_LARGE_RECORDS: 0
INFO  :    OUTPUT_RECORDS: 1
INFO  :    SPILLED_RECORDS: 0
INFO  : TaskCounter_Reducer_2_INPUT_Map_1:
INFO  :    FIRST_EVENT_RECEIVED: 39
INFO  :    INPUT_RECORDS_PROCESSED: 1
INFO  :    LAST_EVENT_RECEIVED: 39
INFO  :    NUM_FAILED_SHUFFLE_INPUTS: 0
INFO  :    NUM_SHUFFLED_INPUTS: 1
INFO  :    SHUFFLE_BYTES: 109
INFO  :    SHUFFLE_BYTES_DECOMPRESSED: 96
INFO  :    SHUFFLE_BYTES_DISK_DIRECT: 109
INFO  :    SHUFFLE_BYTES_TO_DISK: 0
INFO  :    SHUFFLE_BYTES_TO_MEM: 0
INFO  :    SHUFFLE_PHASE_TIME: 65
INFO  : TaskCounter_Reducer_2_OUTPUT_out_Reducer_2:
INFO  :    OUTPUT_RECORDS: 0
INFO  : Starting task [Stage-2:DEPENDENCY_COLLECTION] in serial mode
INFO  : Starting task [Stage-0:MOVE] in serial mode
----------------------------------------------------------------------------------------------
        VERTICES      MODE        STATUS  TOTAL  COMPLETED  RUNNING  PENDING  FAILED  KILLED
----------------------------------------------------------------------------------------------
Map 1 .......... container     SUCCEEDED      1          1        0        0       0       0
Reducer 2 ...... container     SUCCEEDED      1          1        0        0       0       0
----------------------------------------------------------------------------------------------
VERTICES: 02/02  [==========================&amp;gt;&amp;gt;] 100%  ELAPSED TIME: 9.40 s
----------------------------------------------------------------------------------------------
5 rows affected (18.56 seconds)
0: jdbc:hive2://node2.playground-ggangadharan&amp;gt; select * from TAB_SCAPI_DIGNASMBP_1728048648203;
INFO  : Compiling command(queryId=hive_20241009052444_2f65b80f-2ad3-412e-8ac4-03d8987a02db): select * from TAB_SCAPI_DIGNASMBP_1728048648203
INFO  : Semantic Analysis Completed (retrial = false)
INFO  : Created Hive schema: Schema(fieldSchemas:[FieldSchema(name:tab_scapi_dignasmbp_1728048648203.c1, type:int, comment:null), FieldSchema(name:tab_scapi_dignasmbp_1728048648203.c2, type:varchar(20), comment:null)], properties:null)
INFO  : Completed compiling command(queryId=hive_20241009052444_2f65b80f-2ad3-412e-8ac4-03d8987a02db); Time taken: 0.352 seconds
INFO  : Executing command(queryId=hive_20241009052444_2f65b80f-2ad3-412e-8ac4-03d8987a02db): select * from TAB_SCAPI_DIGNASMBP_1728048648203
INFO  : Completed executing command(queryId=hive_20241009052444_2f65b80f-2ad3-412e-8ac4-03d8987a02db); Time taken: 0.013 seconds
INFO  : OK
+---------------------------------------+---------------------------------------+
| tab_scapi_dignasmbp_1728048648203.c1  | tab_scapi_dignasmbp_1728048648203.c2  |
+---------------------------------------+---------------------------------------+
| 1                                     | |VALUE ONE;                           |
| 2                                     | |VALUE TWO;                           |
| 3                                     | |NULL;                                |
| 4                                     | |VALUE FOUR;                          |
| 5                                     | |VALUE FIVE;                          |
+---------------------------------------+---------------------------------------+
5 rows selected (0.587 seconds)
0: jdbc:hive2://node2.playground-ggangadharan&amp;gt; desc formatted TAB_SCAPI_DIGNASMBP_1728048648203;
INFO  : Compiling command(queryId=hive_20241009052507_76fa8fa5-9105-4cc5-adb8-6f93a6050c9c): desc formatted TAB_SCAPI_DIGNASMBP_1728048648203
INFO  : Semantic Analysis Completed (retrial = false)
INFO  : Created Hive schema: Schema(fieldSchemas:[FieldSchema(name:col_name, type:string, comment:from deserializer), FieldSchema(name:data_type, type:string, comment:from deserializer), FieldSchema(name:comment, type:string, comment:from deserializer)], properties:null)
INFO  : Completed compiling command(queryId=hive_20241009052507_76fa8fa5-9105-4cc5-adb8-6f93a6050c9c); Time taken: 0.106 seconds
INFO  : Executing command(queryId=hive_20241009052507_76fa8fa5-9105-4cc5-adb8-6f93a6050c9c): desc formatted TAB_SCAPI_DIGNASMBP_1728048648203
INFO  : Starting task [Stage-0:DDL] in serial mode
INFO  : Completed executing command(queryId=hive_20241009052507_76fa8fa5-9105-4cc5-adb8-6f93a6050c9c); Time taken: 0.169 seconds
INFO  : OK
+-------------------------------+----------------------------------------------------+----------------------------------------------------+
|           col_name            |                     data_type                      |                      comment                       |
+-------------------------------+----------------------------------------------------+----------------------------------------------------+
| c1                            | int                                                |                                                    |
| c2                            | varchar(20)                                        |                                                    |
|                               | NULL                                               | NULL                                               |
| # Detailed Table Information  | NULL                                               | NULL                                               |
| Database:                     | default                                            | NULL                                               |
| OwnerType:                    | USER                                               | NULL                                               |
| Owner:                        | hive                                               | NULL                                               |
| CreateTime:                   | Wed Oct 09 05:18:20 UTC 2024                       | NULL                                               |
| LastAccessTime:               | UNKNOWN                                            | NULL                                               |
| Retention:                    | 0                                                  | NULL                                               |
| Location:                     | hdfs://node4.playground-ggangadharan.coelab.cloudera.com:8020/warehouse/tablespace/external/hive/tab_scapi_dignasmbp_1728048648203 | NULL                                               |
| Table Type:                   | EXTERNAL_TABLE                                     | NULL                                               |
| Table Parameters:             | NULL                                               | NULL                                               |
|                               | COLUMN_STATS_ACCURATE                              | {\"BASIC_STATS\":\"true\",\"COLUMN_STATS\":{\"c1\":\"true\",\"c2\":\"true\"}} |
|                               | EXTERNAL                                           | TRUE                                               |
|                               | bucketing_version                                  | 2                                                  |
|                               | numFiles                                           | 1                                                  |
|                               | numRows                                            | 5                                                  |
|                               | rawDataSize                                        | 62                                                 |
|                               | totalSize                                          | 67                                                 |
|                               | transient_lastDdlTime                              | 1728451483                                         |
|                               | NULL                                               | NULL                                               |
| # Storage Information         | NULL                                               | NULL                                               |
| SerDe Library:                | org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe | NULL                                               |
| InputFormat:                  | org.apache.hadoop.mapred.TextInputFormat           | NULL                                               |
| OutputFormat:                 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat | NULL                                               |
| Compressed:                   | No                                                 | NULL                                               |
| Num Buckets:                  | -1                                                 | NULL                                               |
| Bucket Columns:               | []                                                 | NULL                                               |
| Sort Columns:                 | []                                                 | NULL                                               |
| Storage Desc Params:          | NULL                                               | NULL                                               |
|                               | field.delim                                        | ,                                                  |
|                               | line.delim                                         | \n                                                 |
|                               | serialization.format                               | ,                                                  |
+-------------------------------+----------------------------------------------------+----------------------------------------------------+
34 rows selected (0.363 seconds)
0: jdbc:hive2://node2.playground-ggangadharan&amp;gt; dfs -ls hdfs://node4.playground-ggangadharan.coelab.cloudera.com:8020/warehouse/tablespace/external/hive/tab_scapi_dignasmbp_1728048648203
. . . . . . . . . . . . . . . . . . . . . . .&amp;gt; ;
Error: Error while processing statement: Permission denied: user [hive] does not have privilege for [DFS] command (state=,code=1)
0: jdbc:hive2://node2.playground-ggangadharan&amp;gt; !sh hdfs dfs -ls dfs -ls hdfs://node4.playground-ggangadharan.coelab.cloudera.com:8020/warehouse/tablespace/external/hive/tab_scapi_dignasmbp_1728048648203
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-7.1.9-1.cdh7.1.9.p0.44702451/jars/log4j-slf4j-impl-2.18.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-7.1.9-1.cdh7.1.9.p0.44702451/jars/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
ls: `dfs': No such file or directory
ls: `-ls': No such file or directory
Found 1 items
-rw-r--r--   3 hive hive         67 2024-10-09 05:24 hdfs://node4.playground-ggangadharan.coelab.cloudera.com:8020/warehouse/tablespace/external/hive/tab_scapi_dignasmbp_1728048648203/000000_0
Command failed with exit code = 1
0: jdbc:hive2://node2.playground-ggangadharan&amp;gt; !sh hdfs dfs -cat hdfs://node4.playground-ggangadharan.coelab.cloudera.com:8020/warehouse/tablespace/external/hive/tab_scapi_dignasmbp_1728048648203/000000_0
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-7.1.9-1.cdh7.1.9.p0.44702451/jars/log4j-slf4j-impl-2.18.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-7.1.9-1.cdh7.1.9.p0.44702451/jars/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
1,|VALUE ONE;
2,|VALUE TWO;
3,|NULL;
4,|VALUE FOUR;
5,|VALUE FIVE;
0: jdbc:hive2://node2.playground-ggangadharan&amp;gt;&lt;/LI-CODE&gt;&lt;P&gt;&lt;SPAN&gt;If you prefer not to include the "|" or ";" symbols, please modify the INSERT statement accordingly. Additionally, if you are reading a text file generated by an external process, it is recommended to adjust the table's delimiter so that only the intended values appear in the result.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 09 Oct 2024 05:34:50 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/394807#M248806</guid>
      <dc:creator>ggangadharan</dc:creator>
      <dc:date>2024-10-09T05:34:50Z</dc:date>
    </item>
    <item>
      <title>Re: Getting error during batch execution of records in Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/394840#M248814</link>
      <description>&lt;P&gt;Still getting a compilation error.&lt;/P&gt;&lt;P&gt;I tried updating hive-site.xml with the following properties:&lt;/P&gt;&lt;P&gt;&amp;lt;property&amp;gt;&lt;BR /&gt;&amp;lt;name&amp;gt;hive.exec.dynamic.partition.mode&amp;lt;/name&amp;gt;&lt;BR /&gt;&amp;lt;value&amp;gt;nonstrict&amp;lt;/value&amp;gt;&lt;BR /&gt;&amp;lt;/property&amp;gt;&lt;BR /&gt;&amp;lt;property&amp;gt;&lt;BR /&gt;&amp;lt;name&amp;gt;hive.support.sql11.reserved.keywords&amp;lt;/name&amp;gt;&lt;BR /&gt;&amp;lt;value&amp;gt;false&amp;lt;/value&amp;gt;&lt;BR /&gt;&amp;lt;/property&amp;gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;But it still didn't work; the statement failed to compile.&lt;/P&gt;</description>
      <pubDate>Wed, 09 Oct 2024 18:04:28 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/394840#M248814</guid>
      <dc:creator>explorer77</dc:creator>
      <dc:date>2024-10-09T18:04:28Z</dc:date>
    </item>
    <item>
      <title>Re: Getting error during batch execution of records in Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/394861#M248819</link>
      <description>&lt;P&gt;&lt;SPAN&gt;Could you please attempt the task using Beeline instead of DBeaver? This will help us determine whether there are any misconfigurations specific to DBeaver. Additionally, please ensure that the Cloudera Hive JDBC/ODBC driver is compatible with the cluster version.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 10 Oct 2024 05:32:43 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/394861#M248819</guid>
      <dc:creator>ggangadharan</dc:creator>
      <dc:date>2024-10-10T05:32:43Z</dc:date>
    </item>
    <item>
      <title>Re: Getting error during batch execution of records in Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/395093#M248862</link>
      <description>&lt;P&gt;Beeline output:&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;0: jdbc:hive2://conops-kdc1.fyre.ibm.com:2181&amp;gt; INSERT INTO example (C1, C2) VALUES (1,"|VALUE ONE;"), (2,"|VALUE TWO;"), (3,"|NULL;"), (4,"|VALUE FOUR;"), (5,"|VALUE FIVE;");&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Compiling command(queryId=hive_20241013103020_83c658f9-1969-48bc-9b7b-808ba9b45986): INSERT INTO example (C1, C2) VALUES (1,"|VALUE ONE;"), (2,"|VALUE TWO;"), (3,"|NULL;"), (4,"|VALUE FOUR;"), (5,"|VALUE FIVE;")&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Semantic Analysis Completed (retrial = false)&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Created Hive schema: Schema(fieldSchemas:[FieldSchema(name:_col0, type:int, comment:null), FieldSchema(name:_col1, type:varchar(20), comment:null)], properties:null)&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Completed compiling command(queryId=hive_20241013103020_83c658f9-1969-48bc-9b7b-808ba9b45986); Time taken: 0.656 seconds&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Executing command(queryId=hive_20241013103020_83c658f9-1969-48bc-9b7b-808ba9b45986): INSERT INTO example (C1, C2) VALUES (1,"|VALUE ONE;"), (2,"|VALUE TWO;"), (3,"|NULL;"), (4,"|VALUE FOUR;"), (5,"|VALUE FIVE;")&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Query ID = hive_20241013103020_83c658f9-1969-48bc-9b7b-808ba9b45986&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN 
class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Total jobs = 1&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Launching Job 1 out of 1&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Starting task [Stage-1:MAPRED] in serial mode&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Subscribed to counters: [] for queryId: hive_20241013103020_83c658f9-1969-48bc-9b7b-808ba9b45986&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Tez session hasn't been created yet. Opening session&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;ERROR : Failed to execute tez graph.&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;org.apache.tez.dag.api.SessionNotRunning: TezSession has already shutdown. 
Application application_1726980746968_0077 failed 1 times (global limit =2; local limit is =1) due to AM Container for appattempt_1726980746968_0077_000001 exited with&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;exitCode: -1000&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;Failing this attempt.Diagnostics: [2024-10-13 10:30:27.706]Application application_1726980746968_0077 initialization failed (exitCode=255) with output: main : command provided 0&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;main : run as user is tm_cc_user_2&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;main : requested yarn user is tm_cc_user_2&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;User tm_cc_user_2 not found&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p2"&gt;&amp;nbsp;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;For more detailed output, check the application tracking page: &lt;A href="https://conops-kdc1.fyre.ibm.com:8090/cluster/app/application_1726980746968_0077" target="_blank"&gt;https://conops-kdc1.fyre.ibm.com:8090/cluster/app/application_1726980746968_0077&lt;/A&gt; Then click on links to logs of each attempt.&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;. 
Failing the application.&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.tez.client.TezClient.waitTillReady(TezClient.java:979) ~[tez-api-0.9.1.7.1.8.0-801.jar:0.9.1.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.tez.client.TezClient.waitTillReady(TezClient.java:948) ~[tez-api-0.9.1.7.1.8.0-801.jar:0.9.1.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.startSessionAndContainers(TezSessionState.java:567) ~[hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.openInternal(TezSessionState.java:385) ~[hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:300) ~[hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.exec.tez.TezSessionPoolSession.open(TezSessionPoolSession.java:106) ~[hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.exec.tez.TezTask.ensureSessionHasResources(TezTask.java:410) ~[hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.exec.tez.TezTask.execute(TezTask.java:215) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:213) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:105) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.Executor.launchTask(Executor.java:357) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.Executor.launchTasks(Executor.java:330) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.Executor.runTasks(Executor.java:246) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.Executor.execute(Executor.java:109) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:749) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.Driver.run(Driver.java:504) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.Driver.run(Driver.java:498) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:166) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:226) [hive-service-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at 
org.apache.hive.service.cli.operation.SQLOperation.access$700(SQLOperation.java:88) [hive-service-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:327) [hive-service-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at java.security.AccessController.doPrivileged(Native Method) ~[?:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at javax.security.auth.Subject.doAs(Subject.java:423) [?:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1898) [hadoop-common-3.1.1.7.1.8.0-801.jar:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:345) [hive-service-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at 
java.lang.Thread.run(Thread.java:834) [?:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Completed executing command(queryId=hive_20241013103020_83c658f9-1969-48bc-9b7b-808ba9b45986); Time taken: 7.11 seconds&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: OK&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;Error: Error while compiling statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask (state=08S01,code=1)&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Sun, 13 Oct 2024 17:43:57 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/395093#M248862</guid>
      <dc:creator>explorer777</dc:creator>
      <dc:date>2024-10-13T17:43:57Z</dc:date>
    </item>
    <item>
      <title>Re: Getting error during batch execution of records in Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/395100#M248864</link>
      <description>&lt;P&gt;Beeline output:&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;jdbc:hive2://conops-kdc1.fyre.ibm.com:2181&amp;gt; use tm_cc_db_1;&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Compiling command(queryId=hive_20241013103011_abc6ea02-3f0f-455a-8925-934628f3c2a8): use tm_cc_db_1&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Semantic Analysis Completed (retrial = false)&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Created Hive schema: Schema(fieldSchemas:null, properties:null)&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Completed compiling command(queryId=hive_20241013103011_abc6ea02-3f0f-455a-8925-934628f3c2a8); Time taken: 0.022 seconds&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Executing command(queryId=hive_20241013103011_abc6ea02-3f0f-455a-8925-934628f3c2a8): use tm_cc_db_1&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Starting task [Stage-0:DDL] in serial mode&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Completed executing command(queryId=hive_20241013103011_abc6ea02-3f0f-455a-8925-934628f3c2a8); Time taken: 0.025 seconds&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: OK&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;No rows affected (0.205 seconds)&lt;/SPAN&gt;&lt;/P&gt;&lt;P 
class="p1"&gt;&lt;SPAN class="s1"&gt;0: jdbc:hive2://conops-kdc1.fyre.ibm.com:2181&amp;gt; INSERT INTO example (C1, C2) VALUES (1,"|VALUE ONE;"), (2,"|VALUE TWO;"), (3,"|NULL;"), (4,"|VALUE FOUR;"), (5,"|VALUE FIVE;");&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Compiling command(queryId=hive_20241013103020_83c658f9-1969-48bc-9b7b-808ba9b45986): INSERT INTO example (C1, C2) VALUES (1,"|VALUE ONE;"), (2,"|VALUE TWO;"), (3,"|NULL;"), (4,"|VALUE FOUR;"), (5,"|VALUE FIVE;")&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Semantic Analysis Completed (retrial = false)&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Created Hive schema: Schema(fieldSchemas:[FieldSchema(name:_col0, type:int, comment:null), FieldSchema(name:_col1, type:varchar(20), comment:null)], properties:null)&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Completed compiling command(queryId=hive_20241013103020_83c658f9-1969-48bc-9b7b-808ba9b45986); Time taken: 0.656 seconds&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Executing command(queryId=hive_20241013103020_83c658f9-1969-48bc-9b7b-808ba9b45986): INSERT INTO example (C1, C2) VALUES (1,"|VALUE ONE;"), (2,"|VALUE TWO;"), (3,"|NULL;"), (4,"|VALUE FOUR;"), (5,"|VALUE FIVE;")&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Query ID = hive_20241013103020_83c658f9-1969-48bc-9b7b-808ba9b45986&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; 
&lt;/SPAN&gt;: Total jobs = 1&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Launching Job 1 out of 1&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Starting task [Stage-1:MAPRED] in serial mode&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Subscribed to counters: [] for queryId: hive_20241013103020_83c658f9-1969-48bc-9b7b-808ba9b45986&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Tez session hasn't been created yet. Opening session&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;ERROR : Failed to execute tez graph.&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;org.apache.tez.dag.api.SessionNotRunning: TezSession has already shutdown. 
Application application_1726980746968_0077 failed 1 times (global limit =2; local limit is =1) due to AM Container for appattempt_1726980746968_0077_000001 exited with&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;exitCode: -1000&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;Failing this attempt.Diagnostics: [2024-10-13 10:30:27.706]Application application_1726980746968_0077 initialization failed (exitCode=255) with output: main : command provided 0&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;main : run as user is tm_cc_user_2&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;main : requested yarn user is tm_cc_user_2&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;User tm_cc_user_2 not found&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p2"&gt;&amp;nbsp;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;For more detailed output, check the application tracking page: &lt;A href="https://conops-kdc1.fyre.ibm.com:8090/cluster/app/application_1726980746968_0077" target="_blank"&gt;https://conops-kdc1.fyre.ibm.com:8090/cluster/app/application_1726980746968_0077&lt;/A&gt; Then click on links to logs of each attempt.&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;. 
Failing the application.&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.tez.client.TezClient.waitTillReady(TezClient.java:979) ~[tez-api-0.9.1.7.1.8.0-801.jar:0.9.1.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.tez.client.TezClient.waitTillReady(TezClient.java:948) ~[tez-api-0.9.1.7.1.8.0-801.jar:0.9.1.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.startSessionAndContainers(TezSessionState.java:567) ~[hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.openInternal(TezSessionState.java:385) ~[hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:300) ~[hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.exec.tez.TezSessionPoolSession.open(TezSessionPoolSession.java:106) ~[hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.exec.tez.TezTask.ensureSessionHasResources(TezTask.java:410) ~[hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.exec.tez.TezTask.execute(TezTask.java:215) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:213) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:105) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.Executor.launchTask(Executor.java:357) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.Executor.launchTasks(Executor.java:330) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.Executor.runTasks(Executor.java:246) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.Executor.execute(Executor.java:109) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:749) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.Driver.run(Driver.java:504) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.Driver.run(Driver.java:498) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:166) [hive-exec-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:226) [hive-service-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at 
org.apache.hive.service.cli.operation.SQLOperation.access$700(SQLOperation.java:88) [hive-service-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:327) [hive-service-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at java.security.AccessController.doPrivileged(Native Method) ~[?:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at javax.security.auth.Subject.doAs(Subject.java:423) [?:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1898) [hadoop-common-3.1.1.7.1.8.0-801.jar:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:345) [hive-service-3.1.3000.7.1.8.0-801.jar:3.1.3000.7.1.8.0-801]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;at 
java.lang.Thread.run(Thread.java:834) [?:?]&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: Completed executing command(queryId=hive_20241013103020_83c658f9-1969-48bc-9b7b-808ba9b45986); Time taken: 7.11 seconds&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;INFO&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &lt;/SPAN&gt;: OK&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="p1"&gt;&lt;SPAN class="s1"&gt;Error: Error while compiling statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask (state=08S01,code=1)&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 14 Oct 2024 03:50:20 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/395100#M248864</guid>
      <dc:creator>explorer777</dc:creator>
      <dc:date>2024-10-14T03:50:20Z</dc:date>
    </item>
    <item>
      <title>Re: Getting error during batch execution of records in Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/395347#M248932</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/118157"&gt;@Arathi&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Could you please open a case on the Cloudera Support Portal?&lt;/P&gt;&lt;P&gt;Please attach the application log, the HiveServer2 logs from the time period when the job failed, and the Beeline console output from the failed query.&lt;/P&gt;</description>
      <pubDate>Wed, 16 Oct 2024 20:44:24 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/395347#M248932</guid>
      <dc:creator>MGreen</dc:creator>
      <dc:date>2024-10-16T20:44:24Z</dc:date>
    </item>
    <item>
      <title>Re: Getting error during batch execution of records in Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/395957#M249066</link>
      <description>&lt;P&gt;From the output below, it does look like the Tez session itself was not initialized. Validate the configuration with set -v, make sure everything is correct, and try re-running the query.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;org.apache.tez.dag.api.SessionNotRunning: TezSession has already shutdown. Application application_1726980746968_0077 failed 1 times (global limit =2; local limit is =1) due to AM Container for appattempt_1726980746968_0077_000001 exited with  exitCode: -1000

Failing this attempt.Diagnostics: [2024-10-13 10:30:27.706]Application application_1726980746968_0077 initialization failed (exitCode=255) with output: main : command provided 0&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;If you are unable to identify the incorrect configuration, raise a support case.&lt;/P&gt;</description>
      <pubDate>Fri, 25 Oct 2024 12:11:23 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Getting-error-during-batch-execution-of-records-in-Hive/m-p/395957#M249066</guid>
      <dc:creator>ggangadharan</dc:creator>
      <dc:date>2024-10-25T12:11:23Z</dc:date>
    </item>
  </channel>
</rss>

