I am working on a system where users write DSLs in Scala, which I compile at runtime with a `ScriptEngine` and load as instances of my own type; these instances are then applied on top of RDDs. The whole application runs via spark-submit. All tests pass in SBT and IntelliJ, but under spark-submit the types from my fat jar cannot be imported inside a script. I initialize the script engine as follows:

```scala
import javax.script.{ScriptEngine, ScriptEngineManager}
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.IMain

val engine: ScriptEngine = new ScriptEngineManager().getEngineByName("scala")
private val settings: Settings = engine.asInstanceOf[IMain].settings
settings.usejavacp.value = true
settings.embeddedDefaults[DummyClass]
private val loader: ClassLoader = Thread.currentThread().getContextClassLoader
settings.embeddedDefaults(loader)
```
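As far as I understand, `embeddedDefaults` only discovers classpath entries it can read off `URLClassLoader`s, so one thing I tried is collecting those URLs from the whole classloader chain myself and feeding them to the compiler settings explicitly. This is only a sketch; the helper name `classpathEntries` is mine, not from any library:

```scala
import java.net.{URL, URLClassLoader}

// Sketch of a helper (the name classpathEntries is mine): walk the
// classloader parent chain and collect the URLs exposed by every
// URLClassLoader along the way, so they can be handed to the compiler.
def classpathEntries(cl: ClassLoader): List[String] =
  Iterator
    .iterate(cl)(_.getParent)
    .takeWhile(_ != null)
    .flatMap {
      case u: URLClassLoader => u.getURLs.iterator.map(_.getFile)
      case _                 => Iterator.empty
    }
    .toList

// Idea for use in the driver, before compiling a script:
// settings.classpath.value =
//   classpathEntries(Thread.currentThread().getContextClassLoader)
//     .mkString(java.io.File.pathSeparator)
```

I have not confirmed this is the right fix; it is just the most direct way I found to make the fat jar's entries visible to the embedded compiler.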
This looks like a classloader problem under spark-submit, but I cannot figure out why types from my own jar, which also contains the main program passed to spark-submit, are unavailable to a script created in the same JVM. The scala-compiler, scala-reflect and scala-library versions are all 2.11.8.
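For completeness, the other workaround I am experimenting with is appending the fat jar itself to the compiler classpath, locating it through the protection domain of one of my own classes. Again just a sketch, and the object name `JarLocator` is mine:

```scala
// Sketch: find the jar (or directory) a given class was loaded from.
// Returns None for bootstrap classes, whose code source is null.
object JarLocator {
  def jarOf(clazz: Class[_]): Option[String] =
    for {
      pd  <- Option(clazz.getProtectionDomain)
      src <- Option(pd.getCodeSource)
      loc <- Option(src.getLocation)
    } yield loc.getPath
}

// Usage idea in the driver (DummyClass is one of my own types):
// JarLocator.jarOf(classOf[DummyClass]).foreach { jar =>
//   settings.classpath.value =
//     settings.classpath.value + java.io.File.pathSeparator + jar
// }
```

I am not sure whether pointing the compiler at the jar is enough, or whether the interpreter's own classloader also needs to be re-parented, which is partly why I am asking.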