Akka HTTP exception on Spark

New Contributor

Hi,

I'm trying to run Akka HTTP on Spark, but it fails with a NoSuchMethodError. The CDH version is 5.7.1.

Here's my demo using spark-shell:

$ spark-shell --packages com.typesafe.akka:akka-http-experimental_2.10:1.0

import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.http.scaladsl.Http
import akka.http.scaladsl.server.Directives._

implicit val actorSystem = ActorSystem("system")
// Creating the materializer is what throws the exception below
implicit val actorMaterializer = ActorMaterializer()

java.lang.NoSuchMethodError: com.typesafe.config.Config.getDuration(Ljava/lang/String;Ljava/util/concurrent/TimeUnit;)J
        at akka.stream.StreamSubscriptionTimeoutSettings$.apply(ActorMaterializer.scala:346)
        at akka.stream.ActorMaterializerSettings$.apply(ActorMaterializer.scala:229)
        at akka.stream.ActorMaterializerSettings$.apply(ActorMaterializer.scala:215)
        at akka.stream.ActorMaterializer$$anonfun$1.apply(ActorMaterializer.scala:37)
        at akka.stream.ActorMaterializer$$anonfun$1.apply(ActorMaterializer.scala:37)
        at scala.Option.getOrElse(Option.scala:120)
        at akka.stream.ActorMaterializer$.apply(ActorMaterializer.scala:37)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:37)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:50)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:52)
        at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:54)
        at $iwC$$iwC$$iwC$$iwC.<init>(<console>:56)
        at $iwC$$iwC$$iwC.<init>(<console>:58)
        at $iwC$$iwC.<init>(<console>:60)
        at $iwC.<init>(<console>:62)
        at <init>(<console>:64)
        at .<init>(<console>:68)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1045)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1326)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:821)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:852)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:800)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
        at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
        at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1064)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

 

However, I can run the same demo locally without Spark, with the same code and the same Akka package.

sbt build settings:

scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "com.typesafe.akka" % "akka-http-experimental_2.10" % "1.0",
  "com.typesafe.akka" % "akka-http-spray-json-experimental_2.10" % "1.0",
  "com.typesafe.akka" %"akka-http-testkit-experimental_2.10" % "1.0",
  "org.scalatest" %% "scalatest" % "2.2.5" % "test"
)
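
For reference, the local demo follows the standard Akka HTTP 1.0 binding pattern. A minimal sketch of that kind of program (the /ping route is an illustrative placeholder, not the exact demo code):

import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.http.scaladsl.Http
import akka.http.scaladsl.server.Directives._

object Demo extends App {
  // Same implicits as in the spark-shell session above
  implicit val actorSystem = ActorSystem("system")
  implicit val actorMaterializer = ActorMaterializer()

  // Hypothetical route: GET /ping answers "pong"
  val route = path("ping") {
    get {
      complete("pong")
    }
  }

  // Binds on localhost:8080 and serves the route
  Http().bindAndHandle(route, "localhost", 8080)
}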

 

Any ideas would be greatly appreciated!

1 ACCEPTED SOLUTION

Re: Akka HTTP exception on Spark

Expert Contributor

Spark bundles its own copy of Akka (and of the Typesafe Config library it depends on), and it looks like you are hitting a version mismatch: the Config.getDuration(String, TimeUnit) method that akka-stream 1.0 calls, as shown in your stack trace, does not exist in the older typesafe-config jar on Spark's classpath, which is loaded ahead of the one you pull in with --packages. You will need to either align the Akka version you use with the one bundled in Spark, or shade the conflicting dependencies in your application jar so the correct versions are loaded at runtime.
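
If downgrading Akka HTTP to match Spark's bundled Akka is not an option, a common workaround is to shade the conflicting package with sbt-assembly and submit the resulting fat jar instead of using --packages. A minimal sketch, assuming sbt-assembly 0.14.x (which introduced shade rules); the plugin version and shaded package name are illustrative:

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

// build.sbt
// Relocate com.typesafe.config inside the fat jar so akka-stream and
// akka-http link against the newer Config version they need, not the
// older copy already on Spark's classpath.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.typesafe.config.**" -> "shaded.typesafe.config.@1").inAll
)

Build with sbt assembly and pass the resulting jar to spark-submit. Alternatively, Spark's experimental spark.driver.userClassPathFirst and spark.executor.userClassPathFirst settings make your jars take precedence over Spark's own, though they can surface other conflicts.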
