Support Questions

Find answers, ask questions, and share your expertise

Exception in thread "main" java.lang.NoClassDefFoundError: com/typesafe/config/ConfigFactory


Hi, I built a Spark application which runs fine in local mode, but when I run it on the Hortonworks sandbox it throws the error message at the bottom.

- I included the dependencies in my build.sbt file.

- imported Typesafe Config in my application: import com.typesafe.config._
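For reference, the dependency declaration in build.sbt would look something like this (the version shown is an assumption; use whatever your project actually pins):

```scala
// build.sbt -- library version is an assumption, adjust to your project
libraryDependencies += "com.typesafe" % "config" % "1.3.2"
```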

package retail

import org.apache.spark.{SparkConf, SparkContext}
import com.typesafe.config._
import org.apache.hadoop.fs._

object AvgRevenueDaily {
  def main(args: Array[String]) {

    val appConf = ConfigFactory.load()
    val conf = new SparkConf()
      .setAppName("Average Revenue - Daily")
      .setMaster(appConf.getConfig(args(2)).getString("deploymentMaster"))

    val sc = new SparkContext(conf)
    val inputPath = args(0)
    val outputPath = args(1)
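For context, the appConf.getConfig(args(2)) lookup above expects an application.conf on the classpath (e.g. in src/main/resources) shaped roughly like this; the dev/prod block names are assumptions, only the deploymentMaster key comes from the code:

```
# src/main/resources/application.conf
# block names "dev"/"prod" are assumptions; args(2) selects one of them
dev {
  deploymentMaster = "local[2]"
}
prod {
  deploymentMaster = "yarn-client"
}
```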

Exception in thread "main" java.lang.NoClassDefFoundError: com/typesafe/config/ConfigFactory
    at retail.AvgRevenueDaily$.main(AvgRevenueDaily.scala:11)
    at retail.AvgRevenueDaily.main(AvgRevenueDaily.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(
    at java.lang.reflect.Method.invoke(
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:782)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.typesafe.config.ConfigFactory
    at java.lang.ClassLoader.loadClass(
    at java.lang.ClassLoader.loadClass(
    ... 11 more



And yes, src/main/resources is also added to the Java build path. Any help would be much appreciated!



Can you please add --jars config-<version>.jar to your spark-submit?

You can find the jar under ~/.ivy2/cache/com.typesafe/config/bundles.
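Putting that together, the submit command would look roughly like this; the application jar name, paths, and the final "prod" argument (the config block selected by args(2)) are assumptions, and config-&lt;version&gt;.jar should be replaced with the actual file from the ivy cache:

```shell
# jar names, paths, and the "prod" config key are assumptions
spark-submit \
  --class retail.AvgRevenueDaily \
  --jars ~/.ivy2/cache/com.typesafe/config/bundles/config-<version>.jar \
  avgrevenuedaily_2.11-1.0.jar \
  /input/path /output/path prod
```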



I ran into the same error. Even though the dependencies are listed in sbt, the jars have to be explicitly shipped with the --jars option in spark-submit. Why is this needed?


Any workarounds?
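The reason --jars is needed: spark-submit ships only the application jar itself to the cluster, and sbt resolves com.typesafe:config at compile time without bundling it into that jar, so the class is absent from the runtime classpath unless you ship it separately. A common workaround is to build a fat (assembly) jar so third-party dependencies travel inside the application jar. A minimal sketch, assuming the sbt-assembly plugin (the plugin version is an assumption):

```scala
// project/plugins.sbt -- plugin version is an assumption
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
```

Then running `sbt assembly` produces a single jar under target/scala-*/ that can be passed to spark-submit without any --jars flag.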