
Spark job not able to find Hive table, though the table exists in Hive

Cloudera Employee

I am running the below spark-submit command:

spark-submit --master yarn --deploy-mode cluster \
  --class com.hpe.eap.batch.EAPDataRefinerMain \
  --num-executors 2 --executor-cores 1 \
  --executor-memory 1g --driver-memory 2g \
  --jars application.json,/usr/hdp/current/spark-client/lib/datanucleus-api-jdo-3.2.6.jar,/usr/hdp/current/spark-client/lib/datanucleus-rdbms-3.2.9.jar,/usr/hdp/current/spark-client/lib/datanucleus-core-3.2.10.jar \
  eap-spark-refiner-1.0.jar --files /etc/spark/conf/hive-site.xml

I am getting the error below:

ERROR ApplicationMaster: User class threw exception: java.lang.LinkageError: ClassCastException: attempting to cast jar:file:/data19/hadoop/yarn/local/filecache/79/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.class to jar:file:/data19/hadoop/yarn/local/filecache/79/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.class

java.lang.LinkageError: ClassCastException: attempting to cast jar:file:/data19/hadoop/yarn/local/filecache/79/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.class to jar:file:/data19/hadoop/yarn/local/filecache/79/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.class
    at javax.ws.rs.ext.RuntimeDelegate.findDelegate(RuntimeDelegate.java:116)
    at javax.ws.rs.ext.RuntimeDelegate.getInstance(RuntimeDelegate.java:91)

1 ACCEPTED SOLUTION

Rising Star

@npandey

This can happen due to a conflict between the RuntimeDelegate class from Jersey in the YARN client libraries and the copy in Spark's assembly jar. Please refer to the article below for more information.

https://community.hortonworks.com/articles/101145/spark-job-failure-with-javalanglinkageerror-classc...

Also, note that the hive-site.xml passed to the Spark job should contain only Spark-related properties, such as the metastore information. You can download such a copy for the Spark job via the "Download Client Configs" option in Ambari. Passing the complete file (/etc/hive/conf/hive-site.xml) may include ATS-related properties, which can also cause this issue.
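For reference, a hypothetical corrected spark-submit layout is sketched below. In the command from the question, --files appears after the application jar, so spark-submit treats it as an application argument rather than an option; all options must precede the jar. Moving application.json to an application argument and using a trimmed, metastore-only hive-site.xml are assumptions for this sketch, not confirmed details from the thread.

```shell
# Sketch only, assuming the original paths and class name from the question.
# All spark-submit options come before the application jar; anything after
# the jar is passed to the application itself.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.hpe.eap.batch.EAPDataRefinerMain \
  --num-executors 2 \
  --executor-cores 1 \
  --executor-memory 1g \
  --driver-memory 2g \
  --jars /usr/hdp/current/spark-client/lib/datanucleus-api-jdo-3.2.6.jar,/usr/hdp/current/spark-client/lib/datanucleus-rdbms-3.2.9.jar,/usr/hdp/current/spark-client/lib/datanucleus-core-3.2.10.jar \
  --files hive-site.xml \
  eap-spark-refiner-1.0.jar application.json
```

Here hive-site.xml is the trimmed copy downloaded from Ambari (an assumption), and application.json is passed as an application argument rather than listed under --jars, since it is not a jar.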


3 REPLIES


What versions of Spark, Hive, and YARN are you using?


@npandey

Your Spark job is failing due to a LinkageError. This usually happens when there is a conflict between the RuntimeDelegate class from Jersey in the YARN client libraries and the copy in Spark's assembly jar.

At runtime, YARN calls into ATS (Application Timeline Server) code that needs a different version of the class, and the call fails because the version bundled with Spark and the version in YARN conflict.

To resolve this, set the below property using HiveContext:

val hc = new org.apache.spark.sql.hive.HiveContext(sc)

hc.setConf("yarn.timeline-service.enabled", "false")
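If the setting needs to take effect before any context is created, the same flag can also be passed at submit time. A minimal sketch, relying on Spark's documented behavior that properties prefixed with spark.hadoop. are forwarded into the Hadoop Configuration (the trailing options and jar are elided, not part of this workaround):

```shell
# Equivalent workaround at submit time: disable the YARN timeline client
# so Spark never loads the conflicting Jersey RuntimeDelegate classes.
# The spark.hadoop. prefix forwards the property into the Hadoop config.
spark-submit \
  --conf spark.hadoop.yarn.timeline-service.enabled=false \
  ...
```

This avoids having to set the property programmatically inside the job.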

As always, if this answer helps you, please consider accepting it.
