
Spark Jar caching?

I'm running an application with spark-submit. The application uses both Scala and Java. The spark-submit command specifies the location of the jar files with --jars.
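For context, the invocation looks roughly like this (a sketch; the application class, jar names, and paths are placeholders, not my actual values):

```shell
# Submit the main application jar, plus an extra dependency jar via --jars.
# my.example.MainApp, app.jar, and dep.jar are hypothetical names.
spark-submit \
  --class my.example.MainApp \
  --master yarn \
  --deploy-mode client \
  --jars /path/to/dep.jar \
  /path/to/app.jar
```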


I'm seeing a strange phenomenon: even though I modify my Java files and build new jar files, the cluster sometimes still uses my older jar files. It is as if the cluster has a cached copy of my old jar file.


Can someone please point me to where older or cached jar files might be stored, and how to clean them up?


PS: I'm using Cloudera 5.5.1, with Spark 1.5.0.