First of all, I notice a suspicious ";" trailing the "expected" filesystem URI that Spark is trying to read from/write to (perhaps it is pointing at the wrong Hive Metastore?).
It's also strange that Spark expects me to read/write on "Cluster 1" itself, contrary to what I specified in my Spark session's Hive context configuration (the fs.defaultFS parameter inside the script).
On top of that, even when I set the following parameters I can't make it work. This is odd because, as I said, if I use Spark 1.6 everything runs smoothly, even without these additional configurations:
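For context, this is a sketch of how such Hadoop/Hive overrides are typically passed to a Spark job; the host names, ports, and script name below are placeholders, not the actual values from my clusters (those come from the target cluster's core-site.xml and hive-site.xml):

```shell
# Hypothetical hosts/ports -- substitute the target cluster's
# actual NameNode and Hive Metastore addresses.
spark-submit \
  --conf spark.hadoop.fs.defaultFS=hdfs://cluster2-namenode:8020 \
  --conf spark.hadoop.hive.metastore.uris=thrift://cluster2-metastore:9083 \
  my_script.py
```

The `spark.hadoop.*` prefix forwards each property into the job's Hadoop configuration, which is the usual way to override what the client's local config files would otherwise supply.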