ISSUE: A Spark job fails with "java.lang.LinkageError: ClassCastException: attempting to castjar:file" because of a conflict between the RuntimeDelegate class from Jersey in the YARN client libraries and the copy bundled in Spark's assembly jar.

ERROR:

17/05/02 17:44:25 ERROR ApplicationMaster: User class threw exception: java.lang.LinkageError: ClassCastException: attempting to castjar:file:/u/applic/data/hdfs7/hadoop/yarn/local/filecache/469/spark-assembly-1.6.2.2.5.3.0-37-hadoop2.7.3.2.5.3.0-37.jar!/javax/ws/rs/ext/RuntimeDelegate.classtojar:file:/u/applic/data/hdfs7/hadoop/yarn/local/filecache/469/spark-assembly-1.6.2.2.5.3.0-37-hadoop2.7.3.2.5.3.0-37.jar!/javax/ws/rs/ext/RuntimeDelegate.class
java.lang.LinkageError: ClassCastException: attempting to castjar:file:/u/applic/data/hdfs7/hadoop/yarn/local/filecache/469/spark-assembly-1.6.2.2.5.3.0-37-hadoop2.7.3.2.5.3.0-37.jar!/javax/ws/rs/ext/RuntimeDelegate.classtojar:file:/u/applic/data/hdfs7/hadoop/yarn/local/filecache/469/spark-assembly-1.6.2.2.5.3.0-37-hadoop2.7.3.2.5.3.0-37.jar!/javax/ws/rs/ext/RuntimeDelegate.class
	at javax.ws.rs.ext.RuntimeDelegate.findDelegate(RuntimeDelegate.java:116)
	at javax.ws.rs.ext.RuntimeDelegate.getInstance(RuntimeDelegate.java:91)
	at javax.ws.rs.core.MediaType.<clinit>(MediaType.java:44)
	at com.sun.jersey.core.header.MediaTypes.<clinit>(MediaTypes.java:64)
	at com.sun.jersey.core.spi.factory.MessageBodyFactory.initReaders(MessageBodyFactory.java:182)
	at com.sun.jersey.core.spi.factory.MessageBodyFactory.initReaders(MessageBodyFactory.java:175)
	at com.sun.jersey.core.spi.factory.MessageBodyFactory.init(MessageBodyFactory.java:162)
	at com.sun.jersey.api.client.Client.init(Client.java:342)
	at com.sun.jersey.api.client.Client.access$000(Client.java:118)
	at com.sun.jersey.api.client.Client$1.f(Client.java:191)
	at com.sun.jersey.api.client.Client$1.f(Client.java:187)
	at com.sun.jersey.spi.inject.Errors.processWithErrors(Errors.java:193)
	at com.sun.jersey.api.client.Client.<init>(Client.java:187)
	at com.sun.jersey.api.client.Client.<init>(Client.java:170)
	at org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl.serviceInit(TimelineClientImpl.java:282)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.hive.ql.hooks.ATSHook.<init>(ATSHook.java:67)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at java.lang.Class.newInstance(Class.java:379)
	at org.apache.hadoop.hive.ql.hooks.HookUtils.getHooks(HookUtils.java:60)
	at org.apache.hadoop.hive.ql.Driver.getHooks(Driver.java:1309)
	at org.apache.hadoop.hive.ql.Driver.getHooks(Driver.java:1293)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1347)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1195)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:495)
	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:484)
	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:290)
	at org.apache.spark.sql.hive.client.ClientWrapper.liftedTree1$1(ClientWrapper.scala:237)
	at org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:236)
	at org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:279)
	at org.apache.spark.sql.hive.client.ClientWrapper.runHive(ClientWrapper.scala:484)
	at org.apache.spark.sql.hive.client.ClientWrapper.runSqlHive(ClientWrapper.scala:474)
	at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:624)
	at org.apache.spark.sql.hive.execution.DropTable.run(commands.scala:89)
	at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58)
	at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56)
	at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:70)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:55)
	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:55)
	at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:145)
	at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:130)
	at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:52)
	at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:817)
	at com.ao.multiLevelLoyalty$.main(multiLevelLoyalty.scala:846)
	at com.ao.multiLevelLoyalty.main(multiLevelLoyalty.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:559)
17/05/02 17:44:25 INFO ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.LinkageError: ClassCastException: attempting to castjar:file:/u/applic/data/hdfs7/hadoop/yarn/local/filecache/469/spark-assembly-1.6.2.2.5.3.0-37-hadoop2.7.3.2.5.3.0-37.jar!/javax/ws/rs/ext/RuntimeDelegate.classtojar:file:/u/applic/data/hdfs7/hadoop/yarn/local/filecache/469/spark-assembly-1.6.2.2.5.3.0-37-hadoop2.7.3.2.5.3.0-37.jar!/javax/ws/rs/ext/RuntimeDelegate.class)
17/05/02 17:44:25 INFO SparkContext: Invoking stop() from shutdown hook
17/05/02 17:44:25 INFO SparkUI: Stopped Spark web UI at http://10.225.135.102:35023
17/05/02 17:44:25 INFO YarnAllocator: Driver requested a total number of 0 executor(s).
17/05/02 17:44:25 INFO YarnClusterSchedulerBackend: Shutting down all executors
17/05/02 17:44:25 INFO YarnClusterSchedulerBackend: Asking each executor to shut down

ROOT CAUSE:

This happens because of a conflict between the RuntimeDelegate class from Jersey in the YARN client libraries and the copy in Spark's assembly jar. At runtime, YARN calls into the Application Timeline Service (ATS) client code, which needs a different version of the class and cannot load it, because the version in Spark's assembly and the version in the YARN client libraries conflict.
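To confirm which copy of the class wins at runtime, you can ask the classloader where it resolves RuntimeDelegate from. The snippet below is a minimal diagnostic sketch; run it inside the job (for example, at the top of main) before any Hive or ATS code executes:

// Diagnostic sketch: print which jar javax.ws.rs.ext.RuntimeDelegate
// is loaded from. The resource path matches the class named in the
// stack trace above.
val resource = getClass.getClassLoader
  .getResource("javax/ws/rs/ext/RuntimeDelegate.class")
println(s"RuntimeDelegate loaded from: $resource")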

RESOLUTION:

Set the following property on the HiveContext:

// Disable the YARN Application Timeline Service client so the
// conflicting Jersey classes are never initialized
val hc = new org.apache.spark.sql.hive.HiveContext(sc)
hc.setConf("yarn.timeline-service.enabled", "false")
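If the job runs in yarn-cluster mode, setting the flag on the HiveContext can come too late, because parts of the Hadoop configuration are already read when the application starts. A variant worth trying is to set the property through SparkConf instead; Spark copies any property carrying the spark.hadoop. prefix into the Hadoop Configuration it creates. This is a sketch, not verified on every HDP version (the app name is taken from the stack trace above):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Sketch: set the flag before the SparkContext (and its Hadoop
// Configuration) is built, instead of on the HiveContext afterwards.
val conf = new SparkConf()
  .setAppName("multiLevelLoyalty")
  .set("spark.hadoop.yarn.timeline-service.enabled", "false")
val sc = new SparkContext(conf)
val hc = new HiveContext(sc)

The same property can also be passed without a code change on the spark-submit command line: --conf spark.hadoop.yarn.timeline-service.enabled=false.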
Comments

I tried adding the above conf as directed, but the issue persists; the only change is that it is now trying to cast from the YARN assembly jar to the YARN assembly jar.

Please let me know if I am missing something here or if any configuration changes are required.

ERROR ApplicationMaster: User class threw exception: java.lang.LinkageError: ClassCastException: attempting to castjar:file:/hadoop/hadoop/yarn/local/filecache/11415/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.classtojar:file:/hadoop/hadoop/yarn/local/filecache/11415/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.class
java.lang.LinkageError: ClassCastException: attempting to castjar:file:/hadoop/hadoop/yarn/local/filecache/11415/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.classtojar:file:/hadoop/hadoop/yarn/local/filecache/11415/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.class

Hello,

I did as directed above:

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext.setConf("yarn.timeline-service.enabled", "false")

Now I have ended up with a new issue that is very similar to the earlier one, but it is now trying to cast from the YARN client to the YARN client; I don't see Spark's assembly jar in the error.

Let me know if any configuration changes are required.

I am trying to run in cluster mode on HDP 2.4.2.0-258.

diagnostics: User class threw exception: java.lang.LinkageError: ClassCastException: attempting to castjar:file:/hadoop/hadoop/yarn/local/filecache/10020/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.classtojar:file:/hadoop/hadoop/yarn/local/filecache/10020/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.class
Exception in thread "main" org.apache.spark.SparkException: Application application_1499441914050_13463 finished with failed status
	at org.apache.spark.deploy.yarn.Client.run(Client.scala:1092)
	at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1139)
	at org.apache.spark.deploy.yarn.Client.main(Client.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/07/13 12:38:12 INFO ShutdownHookManager: Shutdown hook called