Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

Bad Symbolic reference on SQLContext.class when compiling with spark 1.6.0-cdh5.9.1

New Contributor

Hi,

I am using CDH 5.9.1. I submitted a Spark job that uses a jar compiled with Spark 1.6.0 and got the following exception:

Exception in thread "main" java.lang.AbstractMethodError
        at org.apache.spark.Logging$class.log(Logging.scala:50)

So I tried to rebuild the jar against spark 1.6.0-cdh5.9.1:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.6.0-cdh5.9.1</version>
  <scope>provided</scope>
</dependency>
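For the `-cdh` versioned artifacts to resolve at all, the build also needs Cloudera's Maven repository declared in the pom. A sketch of the repository entry (the URL is the one commonly documented by Cloudera; worth verifying for your CDH release):

```xml
<repositories>
  <repository>
    <!-- Hosts the 1.6.0-cdh5.9.1 builds of the Spark artifacts -->
    <id>cloudera</id>
    <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
  </repository>
</repositories>
```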

The project has four Spark dependencies: spark-core, spark-catalyst, spark-hive, and spark-sql. I am getting the following compilation error:

[ERROR] error: bad symbolic reference. A signature in SQLContext.class refers to term runtime
[INFO] in package scala.reflect which is not available.
[INFO] It may be completely missing from the current classpath, or the version on
[INFO] the classpath might be incompatible with the version used when compiling SQLContext.class.

scala-library 2.10.4 was used; the same issue appeared even after changing it to 2.10.5.
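This particular "refers to term runtime in package scala.reflect" error usually indicates that scala-reflect is missing from the compile classpath: SQLContext's signatures reference scala.reflect.runtime, which ships in scala-reflect.jar rather than scala-library.jar. A sketch of the dependency that would address it (an assumption based on the error text, not confirmed by the thread; the version should match the scala-library version in use):

```xml
<dependency>
  <!-- Provides scala.reflect.runtime, referenced by SQLContext's signatures -->
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-reflect</artifactId>
  <version>2.10.4</version>
</dependency>
```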

Thanks.

1 ACCEPTED SOLUTION

New Contributor

Though compilation with spark 1.6.0-cdh5.9.1 wasn't successful, my problem got solved. I compiled my code with spark 1.6.0, which built successfully, but at runtime a "java.lang.AbstractMethodError" was thrown from the Spark Logging API. I replaced Spark's Logging with log4j in my code, and after that it worked fine.
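The fix above amounts to no longer mixing in the org.apache.spark.Logging trait (whose binary layout can differ between Spark builds, producing AbstractMethodError at runtime) and instead having the job own its logger. A minimal sketch of the idea; the original poster used log4j, but the JDK's built-in java.util.logging is shown here as a stand-in so the example is self-contained (log4j's Logger.getLogger API is analogous):

```java
import java.util.logging.Logger;

// Hypothetical job class for illustration. Instead of relying on Spark's
// internal Logging trait, the class holds its own logger instance, so no
// binary compatibility with Spark's logging internals is required.
public class MySparkJob {
    // One logger per class, named after the class, as log4j would also do.
    private static final Logger log =
        Logger.getLogger(MySparkJob.class.getName());

    public static void main(String[] args) {
        log.info("Starting job");
        // ... Spark job logic would go here ...
        log.info("Job finished");
    }
}
```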

