<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Hive execution stage failing in informatica - after upgrading cloudera cdp cluster in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Hive-execution-stage-failing-in-informatica-after-upgrading/m-p/413199#M253923</link>
    <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/20288"&gt;@Shelton&lt;/a&gt;&amp;nbsp;still the same error.&amp;nbsp;&lt;BR /&gt;JDK versions used in the Hadoop clusters (for both 7.1.9 &amp;amp; 7.3.1):&lt;/P&gt;&lt;P&gt;dev - 7.3.1 -&amp;gt; java --version&lt;/P&gt;&lt;P&gt;openjdk 17.0.16 2025-07-15 LTS&lt;BR /&gt;OpenJDK Runtime Environment (Red_Hat-17.0.16.0.8-1) (build 17.0.16+8-LTS)&lt;BR /&gt;OpenJDK 64-Bit Server VM (Red_Hat-17.0.16.0.8-1) (build 17.0.16+8-LTS, mixed mode, sharing)&lt;/P&gt;&lt;P&gt;Prod 7.1.9 -&amp;gt;&amp;nbsp;java -version&lt;BR /&gt;&lt;BR /&gt;openjdk version "1.8.0_462"&lt;BR /&gt;OpenJDK Runtime Environment (build 1.8.0_462-b08)&lt;BR /&gt;OpenJDK 64-Bit Server VM (build 25.462-b08, mixed mode)&lt;/P&gt;&lt;P&gt;&lt;A href="https://supportmatrix.cloudera.com/" target="_blank" rel="noopener"&gt;https://supportmatrix.cloudera.com/&lt;/A&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;A href="https://docs.cloudera.com/cdp-private-cloud-base/7.3.1/cdp-private-cloud-base-installation/topics/cdpdc-java-requirements.html" target="_blank" rel="noopener"&gt;https://docs.cloudera.com/cdp-private-cloud-base/7.3.1/cdp-private-cloud-base-installation/topics/cdpdc-java-requirements.html&lt;/A&gt;&lt;BR /&gt;&lt;SPAN&gt;&lt;BR /&gt;From the Cloudera docs: If you are using JDK 17 on your cluster, you must add the following JVM options to the service:&lt;/SPAN&gt;&lt;/P&gt;&lt;PRE&gt;&lt;SPAN class="cdoc-line"&gt;--add-opens=java.base/java.lang=ALL-UNNAMED &lt;/SPAN&gt;
&lt;SPAN class="cdoc-line"&gt;--add-opens=java.management/com.sun.jmx.mbeanserver=ALL-UNNAMED &lt;/SPAN&gt;
&lt;SPAN class="cdoc-line"&gt;--add-exports=java.management/com.sun.jmx.mbeanserver=ALL-UNNAMED &lt;/SPAN&gt;
&lt;SPAN class="cdoc-line"&gt;--add-exports=java.base/sun.net.dns=ALL-UNNAMED &lt;/SPAN&gt;
&lt;SPAN class="cdoc-line"&gt;--add-exports=java.base/sun.net.util=ALL-UNNAMED&lt;/SPAN&gt;
&lt;/PRE&gt;&lt;P&gt;&lt;SPAN&gt;to ensure the jobs run successfully.&lt;BR /&gt;&lt;BR /&gt;Any guidance on whether this JDK mismatch might be the cause of the job failures?&lt;BR /&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;/SPAN&gt;&lt;/P&gt;</description>
    <pubDate>Thu, 25 Dec 2025 08:35:10 GMT</pubDate>
    <dc:creator>Amr5</dc:creator>
    <dc:date>2025-12-25T08:35:10Z</dc:date>
    <item>
      <title>Hive execution stage failing in informatica - after upgrading cloudera cdp cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Hive-execution-stage-failing-in-informatica-after-upgrading/m-p/412934#M253786</link>
      <description>&lt;DIV&gt;Please find the below screenshot and attached logs.&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;Error:&lt;/DIV&gt;&lt;DIV&gt;SEVERE: The Integration Service failed to run the Hive task [Write_test_1jiq2sdd]. See the additional error messages for more information.&lt;/DIV&gt;&lt;DIV&gt;com.informatica.sdk.dtm.ExecutionException: [[HIVE_1070] The Integration Service failed to run Hive query [Write_test_1jiq2sdd_query_3] for task [Write_test_1jiq2sdd] due to following error: Hive error code [2], Hive message [Error while compiling statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Map 1, vertexId=vertex_1764063775281_0054_1_00, diagnostics=[Vertex vertex_1764063775281_0054_1_00 [Map 1] killed/failed due to:INIT_FAILURE, Fail to create InputInitializerManager, java.util.concurrent.ExecutionException: org.apache.tez.dag.api.TezReflectionException: Unable to instantiate class with 1 arguments: org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;2025-11-26 14:25:24,278 AST &amp;lt;LdtmWorkflowTask-pool-6-thread-3&amp;gt; INFO: HADOOP_MAPRED_HOME is /data1/informatica/dei/services/shared/hadoop/CDH_7.218/lib&lt;/DIV&gt;&lt;DIV&gt;2025-11-26 14:25:24,524 AST &amp;lt;LdtmWorkflowTask-pool-6-thread-3&amp;gt; INFO: Could not rename /tmp/sqoop-infadpdev/887867565228272/vrp_branch.java to /tmp/sqoop-infadpdev/887867565228272/vrp_branch.java. 
Error: File element in parameter 'null' already exists: '/tmp/sqoop-infadpdev/887867565228272/vrp_branch.java'&lt;/DIV&gt;&lt;DIV&gt;2025-11-26 14:25:24,524 AST &amp;lt;LdtmWorkflowTask-pool-6-thread-3&amp;gt; INFO: Writing jar file: /tmp/sqoop-infadpdev/887867565228272/vrp_branch.jar&lt;/DIV&gt;&lt;DIV&gt;2025-11-26 14:25:24,655 AST &amp;lt;LdtmWorkflowTask-pool-6-thread-3&amp;gt; INFO: Destination directory hdfs://saibdev/data_lakehouse/tables/raw/SPARK_k164prda/sqoop_staging/S7943081602786963440/CUSOMTER_DO_SRC_20698ca1de97472bbab0dfa0331a4a93 is not present, hence not deleting.&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;_write_ing_m_dynamic_create.vrp_BranchSubnet as a58, w7943081602786963440_write_ing_m_dynamic_create.vrp_BranchTypeCode as a59, w7943081602786963440_write_ing_m_dynamic_create.vrp_DefaultOfficerCode as a60, w7943081602786963440_write_ing_m_dynamic_create.vrp_DefaultSundryCode as a61, w7943081602786963440_write_ing_m_dynamic_create.vrp_Reserved as a62, w7943081602786963440_write_ing_m_dynamic_create.vrp_Unitmnemonic as a63, w7943081602786963440_write_ing_m_dynamic_create.vrp_Zipcode as a64, w7943081602786963440_write_ing_m_dynamic_create.vrp_SalesQueue as a65, w7943081602786963440_write_ing_m_dynamic_create.vrp_BranchApproverQueue as a66, w7943081602786963440_write_ing_m_dynamic_create.vrp_CustomerClassification as a67, w7943081602786963440_write_ing_m_dynamic_create.vrp_SeniorBSRQueue as a68, w7943081602786963440_write_ing_m_dynamic_create.vrp_seniorbsrteam as a69, w7943081602786963440_write_ing_m_dynamic_create.vrp_BranchAccountNumber as a70 FROM default.w7943081602786963440_write_ing_m_dynamic_create&lt;/DIV&gt;&lt;DIV&gt;2025-11-26 14:31:10,660 AST &amp;lt;LdtmWorkflowTask-pool-6-thread-4&amp;gt; WARNING: java.sql.SQLException: Error while compiling statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. 
Vertex failed, vertexName=Map 1, vertexId=vertex_1764063775281_0133_1_00, diagnostics=[Vertex vertex_1764063775281_0133_1_00 [Map 1] killed/failed due to:INIT_FAILURE, Fail to create InputInitializerManager, java.util.concurrent.ExecutionException: org.apache.tez.dag.api.TezReflectionException: Unable to instantiate class with 1 arguments: org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:592)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:571)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.tez.dag.app.dag.RootInputInitializerManager.createInitializerWrappers(RootInputInitializerManager.java:140)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.tez.dag.app.dag.RootInputInitializerManager.runInputInitializers(RootInputInitializerManager.java:111)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.tez.dag.app.dag.impl.VertexImpl.setupInputInitializerManager(VertexImpl.java:4147)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.tez.dag.app.dag.impl.VertexImpl.access$3100(VertexImpl.java:210)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;Caused by: java.lang.ExceptionInInitializerError&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.hadoop.hive.ql.plan.TableDesc.setProperties(TableDesc.java:131)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.hadoop.hive.ql.plan.TableDesc.&amp;lt;init&amp;gt;(TableDesc.java:69)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
org.apache.hadoop.hive.ql.exec.Utilities.&amp;lt;clinit&amp;gt;(Utilities.java:706)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.&amp;lt;init&amp;gt;(HiveSplitGenerator.java:150)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;... 21 more&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;BR /&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="snap.png" style="width: 999px;"&gt;&lt;img src="https://community.cloudera.com/t5/image/serverpage/image-id/46442i48A2F7491AF8339C/image-size/large?v=v2&amp;amp;px=999" role="button" title="snap.png" alt="snap.png" /&gt;&lt;/span&gt;</description>
      <pubDate>Tue, 21 Apr 2026 06:11:42 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Hive-execution-stage-failing-in-informatica-after-upgrading/m-p/412934#M253786</guid>
      <dc:creator>Amr5</dc:creator>
      <dc:date>2026-04-21T06:11:42Z</dc:date>
    </item>
    <item>
      <title>Re: Hive execution stage failing in informatica - after upgrading cloudera cdp cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Hive-execution-stage-failing-in-informatica-after-upgrading/m-p/413047#M253817</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/136584"&gt;@Amr5&lt;/a&gt;&amp;nbsp;&lt;BR /&gt;From the logs you shared, the core issue is:&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask
Vertex failed: INIT_FAILURE
Unable to instantiate class with 1 arguments: org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator
Caused by: java.lang.ExceptionInInitializerError&lt;/LI-CODE&gt;&lt;P&gt;This indicates a &lt;STRONG&gt;classpath/library compatibility issue&lt;/STRONG&gt; between Informatica and the upgraded Cloudera CDP cluster, specifically with the Tez and Hive components.&amp;nbsp;&lt;BR /&gt;Root Causes:&lt;/P&gt;&lt;OL class=""&gt;&lt;LI&gt;&lt;STRONG&gt;Version Mismatch&lt;/STRONG&gt;: The Informatica integration is pointing to CDH 7.218 libraries (your&amp;nbsp;&lt;FONT size="2" color="#993300"&gt;HADOOP_MAPRED_HOME is /data1/informatica/dei/services/shared/hadoop/CDH_7.218/lib),&amp;nbsp;but &lt;FONT size="3" color="#000000"&gt;your cluster was upgraded to a newer CDP version with incompatible Hive/Tez&lt;BR /&gt;libraries.&lt;/FONT&gt;&lt;/FONT&gt;&lt;/LI&gt;&lt;LI&gt;&lt;FONT size="2" color="#993300"&gt;&lt;FONT size="3" color="#000000"&gt;&lt;STRONG&gt;Class Initialization Failure&lt;/STRONG&gt;: The&amp;nbsp;&lt;FONT color="#993300"&gt;HiveSplitGenerator&lt;/FONT&gt;&amp;nbsp;class cannot be instantiated, likely due to missing or incompatible dependencies.&lt;/FONT&gt;&lt;/FONT&gt;&lt;/LI&gt;&lt;LI&gt;&lt;STRONG&gt;Static Initializer Problem&lt;/STRONG&gt;&lt;SPAN&gt;: The&amp;nbsp; &lt;/SPAN&gt;&lt;FONT color="#993300"&gt;ExceptionInInitializerError&lt;/FONT&gt;&lt;SPAN&gt; suggests a static block in one of the Hive classes is failing during initialization.&lt;/SPAN&gt;&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;&lt;STRONG&gt;Solution 1&lt;BR /&gt;&lt;/STRONG&gt;&lt;STRONG&gt;Update the Informatica Hadoop connection&lt;/STRONG&gt;:&lt;/P&gt;&lt;UL class=""&gt;&lt;LI&gt;Go to Administrator → Connections&lt;/LI&gt;&lt;LI&gt;Edit your Hadoop connection&lt;/LI&gt;&lt;LI&gt;Update the Hadoop distribution version to match your new CDP version&lt;/LI&gt;&lt;LI&gt;Update the configuration files (core-site.xml, hdfs-site.xml, hive-site.xml, etc.)&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&lt;STRONG&gt;Update Hadoop libraries&lt;/STRONG&gt;:&lt;BR /&gt;&lt;SPAN&gt;&lt;SPAN class="token"&gt;Copy
new CDP client libraries to Informatica&lt;/SPAN&gt;&lt;BR /&gt;&lt;/SPAN&gt;&lt;/P&gt;&lt;LI-SPOILER&gt;&lt;SPAN class="token"&gt;cp&lt;/SPAN&gt; -r /opt/cloudera/parcels/CDH/lib/* /data1/informatica/dei/services/shared/hadoop/CDP_&lt;SPAN class="token"&gt;&amp;lt;&lt;/SPAN&gt;version&lt;SPAN class="token"&gt;&amp;gt;&lt;/SPAN&gt;/lib/&lt;/LI-SPOILER&gt;&lt;P&gt;&lt;STRONG&gt;Restart Informatica services&lt;/STRONG&gt;:&lt;/P&gt;&lt;LI-SPOILER&gt;infaservice.sh stopService&lt;BR /&gt;infaservice.sh startService&lt;/LI-SPOILER&gt;&lt;P&gt;Additionally, identify and copy missing Tez JARs:&lt;/P&gt;&lt;LI-SPOILER&gt;&lt;FONT size="2"&gt;# From CDP cluster, copy Tez libraries&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;cp /opt/cloudera/parcels/CDH/lib/tez/*.jar /data1/informatica/dei/services/shared/hadoop/CDH_7.218/lib/&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;# Copy Hive execution libraries&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;cp /opt/cloudera/parcels/CDH/lib/hive/lib/hive-exec-*.jar /data1/informatica/dei/services/shared/hadoop/CDH_7.218/lib/&lt;/FONT&gt;&lt;/LI-SPOILER&gt;&lt;P&gt;&lt;STRONG&gt;Update the classpath in the Informatica domain configuration&lt;/STRONG&gt;&lt;BR /&gt;Configure the Hive execution engine: if Tez is causing issues, temporarily switch to MapReduce.&lt;/P&gt;&lt;P&gt;In your Hive connection properties, add:&lt;/P&gt;&lt;LI-SPOILER&gt;hive.execution.engine=mr&lt;/LI-SPOILER&gt;&lt;P&gt;Update this post after the above steps, and always share the logs to help us understand what's happening in your environment.&lt;BR /&gt;&lt;BR /&gt;Happy hadooping&lt;FONT size="2" color="#993300"&gt;&lt;FONT size="3" color="#000000"&gt;&lt;BR /&gt;&lt;/FONT&gt;&lt;/FONT&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 08 Dec 2025 07:30:08 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Hive-execution-stage-failing-in-informatica-after-upgrading/m-p/413047#M253817</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2025-12-08T07:30:08Z</dc:date>
    </item>
    <item>
      <title>Re: Hive execution stage failing in informatica - after upgrading cloudera cdp cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Hive-execution-stage-failing-in-informatica-after-upgrading/m-p/413066#M253830</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/20288"&gt;@Shelton&lt;/a&gt;&amp;nbsp;,&lt;BR /&gt;&lt;BR /&gt;First, many thanks for sharing your insights; much appreciated.&lt;BR /&gt;&lt;BR /&gt;We have re-run this workflow from Informatica again; the same errors are occurring, now with some new details.&lt;BR /&gt;&lt;BR /&gt;Logs:&lt;/P&gt;&lt;P&gt;SEVERE: The Integration Service failed to run the Hive task [Write_test_76ti7ctz]. See the additional error messages for more information.&lt;BR /&gt;com.informatica.sdk.dtm.ExecutionException: [[HIVE_1070] The Integration Service failed to run Hive query [Write_test_76ti7ctz_query_3] for task [Write_test_76ti7ctz] due to following error: Hive error code [2], Hive message [Error while compiling statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Map 1, vertexId=vertex_1764836049228_0036_1_00, diagnostics=[Vertex vertex_1764836049228_0036_1_00 [Map 1] killed/failed due to:INIT_FAILURE, Fail to create InputInitializerManager, java.util.concurrent.ExecutionException: org.apache.tez.dag.api.TezReflectionException: Unable to instantiate class with 1 arguments: org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;2025-12-09 16:44:25,009 AST &amp;lt;LdtmWorkflowTask-pool-6-thread-8&amp;gt; INFO: Could not rename /tmp/sqoop-infadpdev/673817950266090/Fact_Bills_DD.java to /tmp/sqoop-infadpdev/673817950266090/Fact_Bills_DD.java.
Error: File element in parameter 'null' already exists: '/tmp/sqoop-infadpdev/673817950266090/Fact_Bills_DD.java'&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;2025-12-09 16:44:21,704 AST &amp;lt;MappingCompiler-pool-4-thread-3&amp;gt; INFO: [LDTMCMN_0037] The Hadoop distribution directory is defined in Data Integration Service properties at the path [/data1/informatica/dei/services/shared/hadoop/CDH_7.218].&lt;BR /&gt;2025-12-09 16:44:21,704 AST &amp;lt;MappingCompiler-pool-4-thread-3&amp;gt; INFO: [CLUSTERCONF_10024] The cluster configuration [Cloudera_Dev] is unchanged from the last export. Using the existing export file [/data1/informatica/dei/tomcat/bin/disTemp/DOM_IDQ_DEV/DIS_DEI_DEV/node02_DEI_DEV/cloudera_dev/SPARK/665d244e-0368-4d51-8ac9-01e1ac851a1f/infacco-site.xml].&lt;BR /&gt;2025-12-09 16:44:21,704 AST &amp;lt;MappingCompiler-pool-4-thread-3&amp;gt; INFO: [CLUSTERCONF_10028] Based on the distribution [CLOUDERA] and the run-time engine [SPARK], the Data Integration Service will override the following cluster configuration properties at run time: \n&amp;nbsp; - fs.file.impl.disable.cache: true\n&amp;nbsp; - fs.hdfs.impl.disable.cache: true\n&amp;nbsp; - yarn.timeline-service.enabled: false&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;--------------------&lt;BR /&gt;&lt;BR /&gt;We ran the command below to locate hive-exec.jar on the Informatica server and saw that all the hive-exec.jar files were from the older Cloudera version, 7.1.9:&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;sudo find / -type f -name "hive-exec.jar" 2&amp;gt;/dev/null&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;The class named in the error, "Unable to instantiate class with 1 arguments: org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator",&lt;/SPAN&gt;&amp;nbsp;is in hive-exec.jar,&lt;BR /&gt;&lt;BR /&gt;which is pointing to the old path.&lt;/P&gt;</description>
      <pubDate>Tue, 09 Dec 2025 15:10:41 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Hive-execution-stage-failing-in-informatica-after-upgrading/m-p/413066#M253830</guid>
      <dc:creator>Amr5</dc:creator>
      <dc:date>2025-12-09T15:10:41Z</dc:date>
    </item>
    <item>
      <title>Re: Hive execution stage failing in informatica - after upgrading cloudera cdp cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Hive-execution-stage-failing-in-informatica-after-upgrading/m-p/413067#M253831</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/136584"&gt;@Amr5&lt;/a&gt;&amp;nbsp;&lt;BR /&gt;Just as you realized, there is an old-path issue.&amp;nbsp;The error indicates a &lt;STRONG&gt;version mismatch&lt;/STRONG&gt; between the Hive/Tez libraries being used by Informatica and those expected by your Cloudera cluster.&lt;BR /&gt;RCA:&lt;BR /&gt;1. Informatica is using Hive libraries from an older Cloudera version (7.1.9).&lt;BR /&gt;2.&amp;nbsp;Your cluster is running Cloudera 7.2.18 (as shown in the path&amp;nbsp;&lt;FONT color="#993300"&gt;/data1/informatica/dei/services/shared/hadoop/CDH_7.218&lt;/FONT&gt;).&lt;BR /&gt;3.&amp;nbsp;The &lt;FONT color="#993300"&gt;HiveSplitGenerator&lt;/FONT&gt; class in the old &lt;FONT color="#993300"&gt;hive-exec.jar&lt;/FONT&gt; is incompatible with the newer Tez runtime.&lt;BR /&gt;&lt;STRONG&gt;Step 1: Locate Current Hive Libraries&lt;BR /&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;LI-SPOILER&gt;sudo find /data1/informatica -type f -name "hive-exec*.jar" 2&amp;gt;/dev/null&lt;/LI-SPOILER&gt;&lt;P&gt;&lt;STRONG&gt;Step 2: Backup Old Libraries&lt;/STRONG&gt;&lt;/P&gt;&lt;LI-SPOILER&gt;cd /data1/informatica/dei/services/shared/hadoop/CDH_7.218&lt;BR /&gt;mkdir -p backup_old_hive_libs&lt;BR /&gt;mv hive-exec*.jar backup_old_hive_libs/&lt;/LI-SPOILER&gt;&lt;P&gt;&lt;STRONG&gt;Step 3: Copy Correct Hive Libraries from Cluster&lt;BR /&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-SPOILER&gt;&lt;P&gt;# Find the correct hive-exec.jar on your Cloudera cluster&lt;BR /&gt;find /opt/cloudera/parcels -name "hive-exec*.jar" 2&amp;gt;/dev/null&lt;/P&gt;&lt;P&gt;# Copy it to Informatica's Hadoop distribution directory&lt;BR /&gt;cp /opt/cloudera/parcels/CDH-7.2.18*/lib/hive/lib/hive-exec-*.jar \&lt;BR /&gt;/data1/informatica/dei/services/shared/hadoop/CDH_7.218/&lt;/P&gt;&lt;/LI-SPOILER&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;H3&gt;&lt;FONT
size="4"&gt;&lt;STRONG&gt;Step 4: Update Informatica Hadoop Distribution&lt;/STRONG&gt;&lt;/FONT&gt;&lt;/H3&gt;&lt;P class="font-claude-response-body break-words whitespace-normal "&gt;In &lt;STRONG&gt;Informatica Administrator Console&lt;/STRONG&gt;:&lt;/P&gt;&lt;OL class=""&gt;&lt;LI&gt;Navigate to &lt;STRONG&gt;Data Integration Service&lt;/STRONG&gt; → &lt;STRONG&gt;Properties&lt;/STRONG&gt;&lt;/LI&gt;&lt;LI&gt;Go to &lt;STRONG&gt;Hadoop Connection&lt;/STRONG&gt; → &lt;STRONG&gt;Distribution&lt;/STRONG&gt;&lt;/LI&gt;&lt;LI&gt;Verify it points to:&amp;nbsp;&lt;FONT color="#993300"&gt;/data1/informatica/dei/services/shared/hadoop/CDH_7.218&lt;/FONT&gt;&lt;/LI&gt;&lt;LI&gt;Click &lt;STRONG&gt;Test Connection&lt;/STRONG&gt; to validate&lt;/LI&gt;&lt;LI&gt;If needed, use &lt;STRONG&gt;Re-import Hadoop Configuration&lt;/STRONG&gt; to refresh cluster configs&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;&lt;STRONG&gt;Step 5: Restart Services&lt;BR /&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;LI-SPOILER&gt;infaservice.sh dis restart -domain DOM_IDQ_DEV -service DIS_DEI_DEV&lt;/LI-SPOILER&gt;&lt;P&gt;&lt;STRONG&gt;Step 6: Clear Cached Compilation Files&lt;BR /&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;LI-SPOILER&gt;&lt;FONT size="3"&gt;rm -rf /data1/informatica/dei/tomcat/bin/disTemp/DOM_IDQ_DEV/DIS_DEI_DEV/node02_DEI_DEV/cloudera_dev/SPARK/*&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="3"&gt;rm -rf /tmp/sqoop-infadpdev/*&lt;/FONT&gt;&lt;/LI-SPOILER&gt;&lt;P&gt;&lt;STRONG&gt;Step 7: Re-run Your Mapping&lt;/STRONG&gt;&lt;BR /&gt;If you have multiple nodes in your Informatica cluster, repeat Steps 2-3 on &lt;STRONG&gt;all nodes&lt;/STRONG&gt; where the Data Integration Service runs.&lt;BR /&gt;&lt;BR /&gt;Happy hadooping&lt;BR /&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 09 Dec 2025 15:56:25 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Hive-execution-stage-failing-in-informatica-after-upgrading/m-p/413067#M253831</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2025-12-09T15:56:25Z</dc:date>
    </item>
    <item>
      <title>Re: Hive execution stage failing in informatica - after upgrading cloudera cdp cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Hive-execution-stage-failing-in-informatica-after-upgrading/m-p/413069#M253833</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/20288"&gt;@Shelton&lt;/a&gt;&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;After following the steps you shared, we re-ran the mapping and are now getting this error:&lt;BR /&gt;&lt;BR /&gt;java.lang.NoSuchMethodError:&lt;BR /&gt;'org.apache.hadoop.hive.ql.parse.ParseResult org.apache.hadoop.hive.ql.parse.ParseDriver.parse(java.lang.String, org.apache.hadoop.conf.Configuration)'&lt;BR /&gt;&lt;BR /&gt;Please also note that the old hive-exec jars are still present in the path:&lt;BR /&gt;&lt;SPAN&gt;/data1/informatica/dei/services/shared/hadoop/CDH_7.218&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;/data1/informatica/dei/clients/DeveloperClient/hadoop/CDH_7.218/lib/hive-exec-3.1.3000.7.2.18.0-641.jar&lt;BR /&gt;/data1/informatica/dei/clients/DeveloperClient/hadoop/CDH_7.218/lib/hive-exec-3.1.3000.7.3.1.100-57.jar&lt;BR /&gt;/data1/informatica/dei/clients/DeveloperClient/hadoop/CDH_7.218/spark/jars/hive-exec-3.1.3000.7.2.18.0-641.jar&lt;BR /&gt;*****/data1/informatica/dei/clients/DeveloperClient/hadoop/CDH_7.1/lib/hive-exec-3.1.3000.7.1.9.0-387.jar&lt;/P&gt;&lt;P&gt;/data1/informatica/dei/externaljdbcjars/hive-exec-3.1.3000.7.3.1.100-57.jar&lt;BR /&gt;/data1/informatica/dei/services/shared/hadoop/CDH_7.218/lib/hive-exec-3.1.3000.7.1.9.0-387.jar&lt;BR /&gt;/data1/informatica/dei/services/shared/hadoop/CDH_7.218/lib/hive-exec-3.1.3000.7.2.18.0-641.jar&lt;BR /&gt;/data1/informatica/dei/services/shared/hadoop/CDH_7.218/lib/hive-exec-3.1.3000.7.3.1.100-57.jar&lt;BR /&gt;/data1/informatica/dei/services/shared/hadoop/CDH_7.218/spark/jars/hive-exec-3.1.3000.7.1.9.0-387.jar&lt;BR /&gt;/data1/informatica/dei/services/shared/hadoop/CDH_7.218/spark/jars/hive-exec-3.1.3000.7.2.18.0-641.jar&lt;BR /&gt;/data1/informatica/dei/services/shared/hadoop/CDH_7.218/spark/jars/hive-exec-3.1.3000.7.3.1.100-57.jar&lt;BR /&gt;***
/data1/informatica/dei/services/shared/hadoop/CDH_7.1/lib/hive-exec-3.1.3000.7.1.9.0-387.jar&lt;BR /&gt;*** /data1/informatica/dei/services/shared/hadoop/CDH_7.1/spark/jars/hive-exec-3.1.3000.7.1.9.0-387.jar&lt;BR /&gt;/data1/informatica/dei/services/shared/spark/lib_spark/hive-exec-3.1.3000.7.1.9.0-387.jar&lt;BR /&gt;/data1/informatica/dei/services/shared/spark/lib_spark/hive-exec-3.1.3000.7.2.18.0-641.jar&lt;BR /&gt;/data1/informatica/dei/services/shared/spark/lib_spark/hive-exec-3.1.3000.7.3.1.100-57.jar&lt;/P&gt;</description>
      <pubDate>Wed, 10 Dec 2025 12:04:03 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Hive-execution-stage-failing-in-informatica-after-upgrading/m-p/413069#M253833</guid>
      <dc:creator>Amr5</dc:creator>
      <dc:date>2025-12-10T12:04:03Z</dc:date>
    </item>
    <item>
      <title>Re: Hive execution stage failing in informatica - after upgrading cloudera cdp cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Hive-execution-stage-failing-in-informatica-after-upgrading/m-p/413070#M253834</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/136584"&gt;@Amr5&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The NoSuchMethodError means a JAR conflict at runtime. You &lt;STRONG&gt;must&lt;/STRONG&gt; ensure that &lt;STRONG&gt;only CDH 7.2.18 Hive JARs&lt;/STRONG&gt; are in the classpath, with &lt;STRONG&gt;no remnants&lt;/STRONG&gt; of 7.1.9.&lt;BR /&gt;The ParseDriver.parse() method signature changed between Hive versions. In your case, the old Hive JARs (from CDH 7.1.9) are still present in /data1/informatica/dei/services/shared/hadoop/CDH_7.218,&lt;BR /&gt;so Java is loading the &lt;STRONG&gt;old hive-exec.jar&lt;/STRONG&gt; instead of the new one, causing method signature mismatches.&lt;BR /&gt;&lt;STRONG&gt;Step 1: Identify ALL Old Hive JARs&lt;/STRONG&gt;&lt;/P&gt;&lt;LI-SPOILER&gt;find /data1/informatica/dei/services/shared/hadoop/CDH_7.218 -name "hive*.jar" -exec ls -lh {} \;&lt;/LI-SPOILER&gt;&lt;P&gt;&lt;STRONG&gt;Step 2: Remove ALL Old Hive JARs&lt;/STRONG&gt;&lt;/P&gt;&lt;LI-SPOILER&gt;cd /data1/informatica/dei/services/shared/hadoop/CDH_7.218&lt;BR /&gt;# Create backup directory if not exists&lt;BR /&gt;mkdir -p backup_all_old_hive_jars&lt;BR /&gt;# Move ALL hive-related JARs to backup&lt;BR /&gt;mv hive*.jar backup_all_old_hive_jars/&lt;/LI-SPOILER&gt;&lt;P&gt;&lt;STRONG&gt;Step 3: Copy ALL Correct Hive JARs from Cloudera Cluster&lt;/STRONG&gt;&lt;/P&gt;&lt;LI-SPOILER&gt;# Find Cloudera CDH 7.2.18 parcels location&lt;BR /&gt;CLOUDERA_PARCEL=$(find /opt/cloudera/parcels -maxdepth 1 -type d -name "CDH-7.2.18*" | head -1)&lt;BR /&gt;# Copy ALL Hive JARs&lt;BR /&gt;cp $CLOUDERA_PARCEL/lib/hive/lib/hive*.jar /data1/informatica/dei/services/shared/hadoop/CDH_7.218/&lt;BR /&gt;# Also copy Hive dependencies&lt;BR /&gt;cp $CLOUDERA_PARCEL/jars/hive*.jar /data1/informatica/dei/services/shared/hadoop/CDH_7.218/&lt;/LI-SPOILER&gt;&lt;P&gt;&lt;STRONG&gt;Step 4: Verify Correct
Versions&lt;/STRONG&gt;&lt;/P&gt;&lt;LI-SPOILER&gt;&lt;P&gt;cd /data1/informatica/dei/services/shared/hadoop/CDH_7.218&lt;BR /&gt;ls -lh hive*.jar | head -5&lt;/P&gt;&lt;P&gt;# Check the version inside hive-exec.jar&lt;BR /&gt;unzip -p hive-exec-*.jar META-INF/MANIFEST.MF | grep -i version&lt;/P&gt;&lt;/LI-SPOILER&gt;&lt;P&gt;&lt;STRONG&gt;Step 5: Clear Java Classpath Cache&lt;/STRONG&gt;&lt;/P&gt;&lt;LI-SPOILER&gt;# Remove compiled artifacts&lt;BR /&gt;rm -rf /data1/informatica/dei/tomcat/bin/disTemp/DOM_IDQ_DEV/DIS_DEI_DEV/node02_DEI_DEV/cloudera_dev/SPARK/*&lt;BR /&gt;rm -rf /data1/informatica/dei/tomcat/bin/disTemp/DOM_IDQ_DEV/DIS_DEI_DEV/node02_DEI_DEV/cloudera_dev/HIVE/*&lt;/LI-SPOILER&gt;&lt;P&gt;&lt;STRONG&gt;Step 6: Restart Informatica Services&lt;BR /&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;LI-SPOILER&gt;infaservice.sh dis stop -domain DOM_IDQ_DEV -service DIS_DEI_DEV&lt;STRONG&gt;&lt;BR /&gt;&lt;/STRONG&gt;infaservice.sh dis start -domain DOM_IDQ_DEV -service DIS_DEI_DEV&lt;/LI-SPOILER&gt;&lt;P&gt;&lt;STRONG&gt;Step 7: Verify Hadoop Distribution in Informatica Admin Console&lt;/STRONG&gt;&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;Login to &lt;STRONG&gt;Informatica Administrator&lt;/STRONG&gt;&lt;/LI&gt;&lt;LI&gt;Navigate to &lt;STRONG&gt;DIS_DEI_DEV&lt;/STRONG&gt; → &lt;STRONG&gt;Properties&lt;/STRONG&gt; → &lt;STRONG&gt;Hadoop Connection&lt;/STRONG&gt;&lt;/LI&gt;&lt;LI&gt;Click &lt;STRONG&gt;Test Connection&lt;/STRONG&gt;&lt;/LI&gt;&lt;LI&gt;If it fails, click &lt;STRONG&gt;Re-import Hadoop Configuration&lt;/STRONG&gt; to refresh&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;&lt;STRONG&gt;Step 8: Re-run Your Mapping&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;Happy Hadooping&lt;BR /&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 10 Dec 2025 12:21:52 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Hive-execution-stage-failing-in-informatica-after-upgrading/m-p/413070#M253834</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2025-12-10T12:21:52Z</dc:date>
    </item>
    <item>
      <title>Re: Hive execution stage failing in informatica - after upgrading cloudera cdp cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Hive-execution-stage-failing-in-informatica-after-upgrading/m-p/413192#M253918</link>
      <description>&lt;P&gt;Hi again,&lt;BR /&gt;&lt;BR /&gt;We tried a re-run after updating the jars, and also raised a case with Informatica support; they suggested rolling back to the old path, which is&amp;nbsp;&lt;STRONG&gt;CDH 7.2.18 in infa,&lt;BR /&gt;&lt;BR /&gt;&lt;/STRONG&gt;and following this KB:&amp;nbsp;&lt;A class="ms-outlook-linkify" title="https://knowledge.informatica.com/s/article/000251302?language=en_US" href="https://knowledge.informatica.com/s/article/000251302?language=en_US" target="_blank" rel="noopener"&gt;https://knowledge.informatica.com/s/article/000251302?language=en_US&lt;/A&gt;&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;and to add these:&lt;BR /&gt;In order to resolve this error, add the below post-SQL query in the Hive target:&lt;/P&gt;&lt;P&gt;set tez.am.launch.cmd-opts="-XX:+UseNUMA -XX:+UseG1GC -XX:+ResizeTLAB -Xmx20480m \&lt;/P&gt;&lt;P&gt;--add-opens=java.base/java.net=ALL-UNNAMED \&lt;/P&gt;&lt;P&gt;--add-opens=java.base/java.util=ALL-UNNAMED \&lt;/P&gt;&lt;P&gt;--add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED \&lt;/P&gt;&lt;P&gt;--add-opens=java.base/java.util.regex=ALL-UNNAMED \&lt;/P&gt;&lt;P&gt;--add-opens=java.base/java.lang=ALL-UNNAMED \&lt;/P&gt;&lt;P&gt;--add-opens=java.base/java.time=ALL-UNNAMED \&lt;/P&gt;&lt;P&gt;--add-opens=java.base/java.io=ALL-UNNAMED \&lt;/P&gt;&lt;P&gt;--add-opens=java.base/java.nio=ALL-UNNAMED";&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Sun, 21 Dec 2025 10:15:43 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Hive-execution-stage-failing-in-informatica-after-upgrading/m-p/413192#M253918</guid>
      <dc:creator>Amr5</dc:creator>
      <dc:date>2025-12-21T10:15:43Z</dc:date>
    </item>
    <item>
      <title>Re: Hive execution stage failing in informatica - after upgrading cloudera cdp cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Hive-execution-stage-failing-in-informatica-after-upgrading/m-p/413199#M253923</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/20288"&gt;@Shelton&lt;/a&gt;&amp;nbsp;still the same error.&amp;nbsp;&lt;BR /&gt;JDK versions used in the Hadoop clusters (for both 7.1.9 &amp;amp; 7.3.1):&lt;/P&gt;&lt;P&gt;dev - 7.3.1 -&amp;gt; java --version&lt;/P&gt;&lt;P&gt;openjdk 17.0.16 2025-07-15 LTS&lt;BR /&gt;OpenJDK Runtime Environment (Red_Hat-17.0.16.0.8-1) (build 17.0.16+8-LTS)&lt;BR /&gt;OpenJDK 64-Bit Server VM (Red_Hat-17.0.16.0.8-1) (build 17.0.16+8-LTS, mixed mode, sharing)&lt;/P&gt;&lt;P&gt;Prod 7.1.9 -&amp;gt;&amp;nbsp;java -version&lt;BR /&gt;&lt;BR /&gt;openjdk version "1.8.0_462"&lt;BR /&gt;OpenJDK Runtime Environment (build 1.8.0_462-b08)&lt;BR /&gt;OpenJDK 64-Bit Server VM (build 25.462-b08, mixed mode)&lt;/P&gt;&lt;P&gt;&lt;A href="https://supportmatrix.cloudera.com/" target="_blank" rel="noopener"&gt;https://supportmatrix.cloudera.com/&lt;/A&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;A href="https://docs.cloudera.com/cdp-private-cloud-base/7.3.1/cdp-private-cloud-base-installation/topics/cdpdc-java-requirements.html" target="_blank" rel="noopener"&gt;https://docs.cloudera.com/cdp-private-cloud-base/7.3.1/cdp-private-cloud-base-installation/topics/cdpdc-java-requirements.html&lt;/A&gt;&lt;BR /&gt;&lt;SPAN&gt;&lt;BR /&gt;From the Cloudera docs: If you are using JDK 17 on your cluster, you must add the following JVM options to the service:&lt;/SPAN&gt;&lt;/P&gt;&lt;PRE&gt;&lt;SPAN class="cdoc-line"&gt;--add-opens=java.base/java.lang=ALL-UNNAMED &lt;/SPAN&gt;
&lt;SPAN class="cdoc-line"&gt;--add-opens=java.management/com.sun.jmx.mbeanserver=ALL-UNNAMED &lt;/SPAN&gt;
&lt;SPAN class="cdoc-line"&gt;--add-exports=java.management/com.sun.jmx.mbeanserver=ALL-UNNAMED &lt;/SPAN&gt;
&lt;SPAN class="cdoc-line"&gt;--add-exports=java.base/sun.net.dns=ALL-UNNAMED &lt;/SPAN&gt;
&lt;SPAN class="cdoc-line"&gt;--add-exports=java.base/sun.net.util=ALL-UNNAMED&lt;/SPAN&gt;
&lt;/PRE&gt;&lt;P&gt;&lt;SPAN&gt;to ensure the jobs run successfully.&lt;BR /&gt;&lt;BR /&gt;Any guidance on whether this JDK mismatch might be the cause of the job failures?&lt;BR /&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 25 Dec 2025 08:35:10 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Hive-execution-stage-failing-in-informatica-after-upgrading/m-p/413199#M253923</guid>
      <dc:creator>Amr5</dc:creator>
      <dc:date>2025-12-25T08:35:10Z</dc:date>
    </item>
  </channel>
</rss>

