I'm having problems getting a real Hive context in a Spark/Scala application (jar) running as an Oozie Spark action. The app writes to HDFS folders just fine, but it cannot see the tables I see in the Hue Hive editor; it appears to be creating a new, empty metastore somewhere instead of connecting to the existing one. I have tried including hive-site.xml in various places, but to no effect. I've tried including it in the following locations:
I have run the same code successfully many times in spark-shell, so I suspect I simply placed hive-site.xml incorrectly in one of those locations.
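For reference, the context is created roughly like this (a simplified sketch, not my exact code; the object name and query are placeholders, and this assumes the Spark 1.x-style HiveContext):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveTest {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("HiveTest"))
    // HiveContext only talks to the real metastore if hive-site.xml is
    // on the driver's classpath; otherwise it silently creates a local
    // Derby metastore in the working directory, which would explain
    // why no existing tables are visible.
    val hc = new HiveContext(sc)
    hc.sql("SHOW TABLES").collect().foreach(println)
  }
}
```

In spark-shell this works presumably because the cluster's Hive client configuration is already on the classpath, whereas under the Oozie launcher it apparently is not, which would be consistent with the empty-metastore behavior I'm seeing.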
Any thoughts on what I am missing?