<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Spark job not able to find Hive table, though the table exists in Hive - Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-not-able-to-find-hive-table-though-the-table-exist/m-p/218187#M180088</link>
    <description>&lt;P&gt;I am running the Spark command below.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Spark command:&lt;/STRONG&gt;&lt;/P&gt;&lt;PRE&gt;spark-submit --master yarn --deploy-mode cluster --class com.hpe.eap.batch.EAPDataRefinerMain --num-executors 2 --executor-cores 1 --executor-memory 1g --driver-memory 2g --jars application.json,/usr/hdp/current/spark-client/lib/datanucleus-api-jdo-3.2.6.jar,/usr/hdp/current/spark-client/lib/datanucleus-rdbms-3.2.9.jar,/usr/hdp/current/spark-client/lib/datanucleus-core-3.2.10.jar, eap-spark-refiner-1.0.jar --files /etc/spark/conf/hive-site.xml&lt;/PRE&gt;&lt;P&gt;&lt;STRONG&gt;I am getting the error below.&lt;/STRONG&gt;&lt;/P&gt;&lt;PRE&gt;ERROR ApplicationMaster: User class threw exception: java.lang.LinkageError: ClassCastException: attempting to cast jar:file:/data19/hadoop/yarn/local/filecache/79/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.class to jar:file:/data19/hadoop/yarn/local/filecache/79/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.class
java.lang.LinkageError: ClassCastException: attempting to cast jar:file:/data19/hadoop/yarn/local/filecache/79/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.class to jar:file:/data19/hadoop/yarn/local/filecache/79/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.class
at javax.ws.rs.ext.RuntimeDelegate.findDelegate(RuntimeDelegate.java:116)
at javax.ws.rs.ext.RuntimeDelegate.getInstance(RuntimeDelegate.java:91)&lt;/PRE&gt;</description>
    <pubDate>Thu, 29 Jun 2017 21:00:56 GMT</pubDate>
    <dc:creator>npandey</dc:creator>
    <dc:date>2017-06-29T21:00:56Z</dc:date>
    <item>
      <title>Spark job not able to find Hive table, though the table exists in Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-not-able-to-find-hive-table-though-the-table-exist/m-p/218187#M180088</link>
      <description>&lt;P&gt;I am running the Spark command below.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Spark command:&lt;/STRONG&gt;&lt;/P&gt;&lt;PRE&gt;spark-submit --master yarn --deploy-mode cluster --class com.hpe.eap.batch.EAPDataRefinerMain --num-executors 2 --executor-cores 1 --executor-memory 1g --driver-memory 2g --jars application.json,/usr/hdp/current/spark-client/lib/datanucleus-api-jdo-3.2.6.jar,/usr/hdp/current/spark-client/lib/datanucleus-rdbms-3.2.9.jar,/usr/hdp/current/spark-client/lib/datanucleus-core-3.2.10.jar, eap-spark-refiner-1.0.jar --files /etc/spark/conf/hive-site.xml&lt;/PRE&gt;&lt;P&gt;&lt;STRONG&gt;I am getting the error below.&lt;/STRONG&gt;&lt;/P&gt;&lt;PRE&gt;ERROR ApplicationMaster: User class threw exception: java.lang.LinkageError: ClassCastException: attempting to cast jar:file:/data19/hadoop/yarn/local/filecache/79/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.class to jar:file:/data19/hadoop/yarn/local/filecache/79/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.class
java.lang.LinkageError: ClassCastException: attempting to cast jar:file:/data19/hadoop/yarn/local/filecache/79/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.class to jar:file:/data19/hadoop/yarn/local/filecache/79/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.class
at javax.ws.rs.ext.RuntimeDelegate.findDelegate(RuntimeDelegate.java:116)
at javax.ws.rs.ext.RuntimeDelegate.getInstance(RuntimeDelegate.java:91)&lt;/PRE&gt;</description>
      <pubDate>Thu, 29 Jun 2017 21:00:56 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-not-able-to-find-hive-table-though-the-table-exist/m-p/218187#M180088</guid>
      <dc:creator>npandey</dc:creator>
      <dc:date>2017-06-29T21:00:56Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job not able to find Hive table, though the table exists in Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-not-able-to-find-hive-table-though-the-table-exist/m-p/218188#M180089</link>
      <description>&lt;P&gt;What versions of Spark, Hive, and YARN are you using?&lt;/P&gt;</description>
      <pubDate>Thu, 29 Jun 2017 21:42:20 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-not-able-to-find-hive-table-though-the-table-exist/m-p/218188#M180089</guid>
      <dc:creator>dineshc</dc:creator>
      <dc:date>2017-06-29T21:42:20Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job not able to find Hive table, though the table exists in Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-not-able-to-find-hive-table-though-the-table-exist/m-p/218189#M180090</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/19007/npandey.html" nodeid="19007"&gt;@npandey&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Your Spark job is failing with a LinkageError. This usually happens when there is a conflict between the Jersey RuntimeDelegate in the YARN client libraries and the copy in Spark's assembly jar.&lt;/P&gt;&lt;P&gt;At runtime, YARN calls into ATS (Application Timeline Service) code that needs a different version of the class, and it cannot find it because the version in Spark conflicts with the version in YARN.&lt;/P&gt;&lt;P&gt;To resolve this, set the property below using HiveContext:&lt;/P&gt;&lt;PRE&gt;hc = new org.apache.spark.sql.hive.HiveContext(sc);

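// A sketch of an equivalent approach (not verified on every HDP release):
// Spark forwards any "spark.hadoop.*" conf entry into the Hadoop Configuration,
// so the same flag can be passed at submit time, before the ATS client loads:
//   spark-submit --conf spark.hadoop.yarn.timeline-service.enabled=false ...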
hc.setConf("yarn.timeline-service.enabled", "false")&lt;/PRE&gt;&lt;P&gt;As always, if this answer helps you, please consider accepting it.&lt;/P&gt;</description>
      <pubDate>Thu, 29 Jun 2017 21:47:44 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-not-able-to-find-hive-table-though-the-table-exist/m-p/218189#M180090</guid>
      <dc:creator>dineshc</dc:creator>
      <dc:date>2017-06-29T21:47:44Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job not able to find Hive table, though the table exists in Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-not-able-to-find-hive-table-though-the-table-exist/m-p/218190#M180091</link>
      <description>&lt;P&gt;&lt;A href="https://community.hortonworks.com/users/19007/npandey.html"&gt;@npandey&lt;/A&gt;&lt;/P&gt;&lt;P&gt;This can happen due to a conflict between the Jersey RuntimeDelegate in the YARN client libraries and the copy in Spark's assembly jar. Please refer to the article below for more information.&lt;/P&gt;&lt;P&gt;&lt;A href="https://community.hortonworks.com/articles/101145/spark-job-failure-with-javalanglinkageerror-classc.html" target="_blank"&gt;https://community.hortonworks.com/articles/101145/spark-job-failure-with-javalanglinkageerror-classc.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;
Also, note that the hive-site.xml passed to Spark should contain only Spark-related properties, such as the metastore information. You can download this for the Spark job via the "Download Client Configs" option in Ambari.
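&lt;/P&gt;&lt;P&gt;For illustration only, a minimal Spark-side hive-site.xml might carry little more than the metastore URI (the host and port below are placeholders, not values from this cluster):&lt;/P&gt;&lt;PRE&gt;&amp;lt;configuration&amp;gt;
  &amp;lt;property&amp;gt;
    &amp;lt;name&amp;gt;hive.metastore.uris&amp;lt;/name&amp;gt;
    &amp;lt;value&amp;gt;thrift://metastore-host:9083&amp;lt;/value&amp;gt;
  &amp;lt;/property&amp;gt;
&amp;lt;/configuration&amp;gt;&lt;/PRE&gt;&lt;P&gt;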
Passing the complete file (/etc/hive/conf/hive-site.xml) may include ATS-related properties, which can also cause this issue.&lt;/P&gt;</description>
      <pubDate>Thu, 29 Jun 2017 22:11:29 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-not-able-to-find-hive-table-though-the-table-exist/m-p/218190#M180091</guid>
      <dc:creator>prsingh1</dc:creator>
      <dc:date>2017-06-29T22:11:29Z</dc:date>
    </item>
  </channel>
</rss>

