02-26-2016 12:19 PM
Hey Craig - Spark's HiveContext requires the use of *some* metastore. In this case, since you're not specifying one, it's creating the default, file-based metastore_db. Here are some more details:

https://github.com/apache/spark/blob/99dfcedbfd4c83c7b6a343456f03e8c6e29968c5/examples/src/main/scala/org/apache/spark/examples/sql/hive/HiveFromSpark.scala#L42
http://spark.apache.org/docs/latest/sql-programming-guide.html#hive-tables

A few options:
1) Make sure the location is writable by your Spark processes.
2) Configure hive-site.xml to place the file in a different location.
3) Move to MySQL or an equivalent database for true metastore functionality (which might be needed elsewhere anyway).
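For reference, here's a minimal sketch (not from your code, just assuming Spark 1.x-style HiveContext usage) of what triggers the metastore_db directory in the first place:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveContextSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("HiveContextSketch"))

    // Using HiveContext pulls in a metastore. With no hive-site.xml on the
    // classpath, Spark falls back to an embedded Derby metastore and a
    // metastore_db/ directory (plus derby.log) shows up in the driver's
    // current working directory.
    val hiveContext = new HiveContext(sc)

    hiveContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
    hiveContext.sql("SHOW TABLES").collect().foreach(println)

    sc.stop()
  }
}

For options 2 and 3, the usual knobs in hive-site.xml are javax.jdo.option.ConnectionURL (pointing Derby at a writable path, or at a MySQL JDBC URL) along with javax.jdo.option.ConnectionDriverName and the ConnectionUserName/ConnectionPassword properties when you move to a real database.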