I added a config property called spark.local.dir, and that seems to have
resolved the issue below: I can now select from tables when connecting to
port 10015 in beeline.
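For anyone following along, the property just goes in spark-defaults.conf
(the exact location depends on your distribution; the path below is my guess
for a typical HDP-style layout, so adjust for your install):

    # /etc/spark/conf/spark-defaults.conf  (location assumed)
    spark.local.dir /tmp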
I set it to /tmp, since the directory needs to be writable by the spark
process. I first tried creating a sub-directory, /tmp/spark-tmp, and changing
its ownership to spark:hadoop, but Spark didn't like it for some reason,
maybe because the directory wasn't executable.
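If anyone wants to retry the sub-directory approach, I suspect something
like this would do it; the execute bit is what lets the spark user actually
traverse into the directory (untested on my end, user/group and paths are
from my setup):

    mkdir -p /tmp/spark-tmp
    chown spark:hadoop /tmp/spark-tmp
    chmod 775 /tmp/spark-tmp    # rwx for owner and group, incl. execute bit
    # then point spark.local.dir at /tmp/spark-tmp and restart the Thrift Server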
From /var/log/spark/spark-hive-org.apache.spark.sql.hive.thriftserver.HiveThriftServer2-1-mdzusvpclhdp001.mdz.local.out:
16/08/03 14:58:14 ERROR DiskBlockManager: Failed to create local
dir in /tmp/spark-tmp. Ignoring this directory.
java.io.IOException: Failed to create a temp directory (under
/tmp/spark-tmp) after 10 attempts!