Created 03-28-2018 06:36 PM
Hi, I built a Spark application that runs fine in local mode, but when I run it on the Hortonworks sandbox it throws the error message at the bottom.
- I included the dependencies in my build.sbt file (the relevant entry is shown below).
- I imported Typesafe Config in my application => import com.typesafe.config._
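For reference, the build.sbt dependency line for Typesafe Config looks like this (the version shown is illustrative, not taken from the original post):

libraryDependencies += "com.typesafe" % "config" % "1.3.2"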
package retail

import org.apache.spark.{SparkConf, SparkContext}
import com.typesafe.config._
import org.apache.hadoop.fs._

object AvgRevenueDaily {
  def main(args: Array[String]): Unit = {
    // Loads application.conf from the classpath (src/main/resources)
    val appConf = ConfigFactory.load()
    val conf = new SparkConf().
      setAppName("Average Revenue - Daily").
      setMaster(appConf.getConfig(args(2)).getString("deploymentMaster"))
    val sc = new SparkContext(conf)
    val inputPath = args(0)
    val outputPath = args(1)
    // ... rest of the job truncated in the original post
  }
}
Exception in thread "main" java.lang.NoClassDefFoundError: com/typesafe/config/ConfigFactory
    at retail.AvgRevenueDaily$.main(AvgRevenueDaily.scala:11)
    at retail.AvgRevenueDaily.main(AvgRevenueDaily.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:782)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.typesafe.config.ConfigFactory
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 11 more
Created 03-28-2018 06:39 PM
And yes, src/main/resources is also added to the Java build path. Any help please!
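Since the code reads appConf.getConfig(args(2)).getString("deploymentMaster"), the application.conf in src/main/resources would be shaped roughly like this (the dev/prod keys and master URLs are illustrative, not from the original post):

dev {
  deploymentMaster = "local[*]"
}
prod {
  deploymentMaster = "yarn-client"
}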
Created 07-09-2019 02:57 PM
Hi!
Can you please add --jars config-<version>.jar to your spark-submit?
You can find the jar under this path: ~/.ivy2/cache/com.typesafe/config/bundles
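With that, the submit command looks something like this (the application jar name and argument values are placeholders; the three trailing arguments map to args(0), args(1), and args(2) in the code above):

spark-submit --class retail.AvgRevenueDaily \
  --jars ~/.ivy2/cache/com.typesafe/config/bundles/config-<version>.jar \
  <application-jar> <inputPath> <outputPath> <configKey>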
Thanks!
Created 10-14-2019 01:21 PM
I ran into the same error. Even though the dependencies are listed in build.sbt, the jars still have to be shipped explicitly with the --jars option in spark-submit. Why is this needed?
Any workarounds?
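One common workaround: sbt package builds a jar containing only your own classes, and spark-submit ships just that jar, so library dependencies have to reach the cluster via --jars, --packages, or by bundling them into a fat jar. A minimal fat-jar sketch with the sbt-assembly plugin (plugin and library versions are illustrative, not verified against the sandbox):

// project/plugins.sbt — adds the sbt-assembly plugin
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")

// build.sbt
name := "retail"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  // Typesafe Config gets bundled into the fat jar
  "com.typesafe" % "config" % "1.3.2",
  // Spark itself is provided by the cluster, so keep it out of the fat jar
  "org.apache.spark" %% "spark-core" % "1.6.2" % "provided"
)

Then run sbt assembly and pass the resulting fat jar to spark-submit; no --jars option is needed, since the dependency travels inside the application jar.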