Created on 01-25-2017 09:47 AM - edited 09-16-2022 03:57 AM
Hi,
I am using CDH 5.9.1. I submitted a Spark job that uses a jar compiled against Spark 1.6.0 and got the following exception:
Exception in thread "main" java.lang.AbstractMethodError at org.apache.spark.Logging$class.log(Logging.scala:50)
So I tried to rebuild the jar against Spark 1.6.0-cdh5.9.1:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.6.0-cdh5.9.1</version>
  <scope>provided</scope>
</dependency>
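In case it helps others: as far as I know the `*-cdh` artifacts are published in Cloudera's Maven repository, not Maven Central, so the build also needs a repository entry along these lines (the URL is the one commonly documented by Cloudera; verify it for your setup):

```xml
<!-- Cloudera's public Maven repository; *-cdh artifacts are not on Maven Central -->
<repositories>
  <repository>
    <id>cloudera</id>
    <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
  </repository>
</repositories>
```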
The project has four Spark dependencies: spark-core, spark-catalyst, spark-hive, and spark-sql. I now get the following compilation error:
[ERROR] error: bad symbolic reference. A signature in SQLContext.class refers to term runtime
[INFO] in package scala.reflect which is not available.
[INFO] It may be completely missing from the current classpath, or the version on
[INFO] the classpath might be incompatible with the version used when compiling SQLContext.class.
scala-library 2.10.4 was used; the same error appears after changing it to 2.10.5.
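This particular error usually means the scala-reflect jar is missing from the compile classpath: SQLContext's API uses types from scala.reflect.runtime (TypeTags), which live in a separate artifact from scala-library. A sketch of the dependency that often resolves it (the version must match your scala-library exactly):

```xml
<!-- scala-reflect provides scala.reflect.runtime, referenced by SQLContext's signatures -->
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-reflect</artifactId>
  <version>2.10.5</version>
  <scope>provided</scope>
</dependency>
```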
Thanks.
Created 02-01-2017 05:26 AM
Though compilation against Spark 1.6.0-cdh5.9.1 never succeeded, my problem is solved. I compiled my code against plain Spark 1.6.0; that built fine, but at runtime a "java.lang.AbstractMethodError" was thrown from Spark's Logging API. I replaced Spark's Logging trait with log4j in my code, and after that it worked fine.
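For anyone hitting the same AbstractMethodError: the swap I describe looks roughly like this (class and message names here are illustrative, not my actual code; log4j 1.x already ships with Spark 1.6, so no extra dependency was needed):

```scala
import org.apache.log4j.Logger

// Before: class MyJob extends org.apache.spark.Logging { ... logInfo("...") ... }
// After: a plain log4j logger, so no Spark-internal trait is on the inheritance path.
class MyJob {
  // @transient + lazy so the logger is not serialized with closures
  // and is re-created on each executor
  @transient private lazy val log = Logger.getLogger(getClass.getName)

  def run(): Unit = {
    log.info("starting job") // was: logInfo("starting job")
  }
}
```

The org.apache.spark.Logging trait was Spark-internal (and was made private in Spark 2.x), so depending on it ties your jar to the exact Spark build it was compiled against; a plain logger avoids that coupling.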