I've seen Hadoop load native libraries from the running user's home directory onto the classpath. Maybe the same thing is happening to you with Spark. Check your home directory with ls ~/lib* for files like these:

libhadoop.a
libhadoop.so
libhadoop.so.1.0.0
libhadooppipes.a
libhadooputils.a
libhdfs.a
libsnappy.so
libsnappy.so.1
libsnappy.so.1.1.3

and delete them if found. I could be totally off, but this was the culprit in our case.
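A minimal sketch of the cleanup, assuming the stray libraries sit directly in the home directory. Instead of deleting outright, this quarantines them into a backup directory so you can restore them if something breaks; it is demonstrated on a temp dir here, and you would point LIB_DIR at "$HOME" to run it for real (the backup directory name and the sample files are my own choices, not from the original post):

```shell
# Demonstration setup: a temp dir standing in for $HOME, with fake stray libs.
LIB_DIR=$(mktemp -d)                      # use LIB_DIR="$HOME" for real
touch "$LIB_DIR/libhadoop.so" "$LIB_DIR/libsnappy.so.1"

# Quarantine any matching native libraries rather than deleting them.
BACKUP="$LIB_DIR/native-libs-backup"
mkdir -p "$BACKUP"
for f in "$LIB_DIR"/libhadoop* "$LIB_DIR"/libhdfs* "$LIB_DIR"/libsnappy*; do
  # Skip glob patterns that matched nothing.
  [ -e "$f" ] && mv "$f" "$BACKUP/"
done

# Show what was quarantined.
ls "$BACKUP"
```

If Spark then runs cleanly, the quarantined libraries really were being picked up from the home directory and you can delete the backup.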