Member since 05-13-2016
Posts: 3
Kudos Received: 5
Solutions: 1

My Accepted Solutions

Title | Views | Posted
---|---|---
 | 10017 | 05-17-2016 12:58 AM
12-20-2016
06:02 AM
1 Kudo
Found another way of achieving this, which also works for PySpark in an Oozie Spark action. Add the following to the <spark-opts> tag in the action definition: --conf spark.yarn.appMasterEnv.hive.metastore.uris=thrift://<your-hive-metastore>:9083 This adds the metastore URI to the application master's environment and should allow a PySpark script to connect to Hive and use its tables.
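For reference, a minimal sketch of where that option sits in an Oozie workflow action. The action name, script name, and metastore host below are illustrative placeholders, not taken from the original post:

```xml
<action name="spark-job">
  <spark xmlns="uri:oozie:spark-action:0.1">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <master>yarn-cluster</master>
    <name>PySparkHiveJob</name>
    <!-- For a PySpark job, <jar> points at the Python script. -->
    <jar>my_script.py</jar>
    <!-- Exposes the Hive metastore URI to the application master environment.
         metastore.example.com is a placeholder for the real metastore host. -->
    <spark-opts>--conf spark.yarn.appMasterEnv.hive.metastore.uris=thrift://metastore.example.com:9083</spark-opts>
  </spark>
  <ok to="end"/>
  <error to="fail"/>
</action>
```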
05-17-2016
12:58 AM
4 Kudos
I found a solution, though it is not the prettiest. In the Spark job, before creating the SparkContext, you need to set a system property for the Hive metastore URI, like so: System.setProperty("hive.metastore.uris", "thrift://<your metastore host>:9083"); I have tried setting this through the Oozie configuration, but to no avail. So far, this was the only way to make it work.
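A minimal, runnable sketch of the property mechanism involved here. The class name and metastore host are illustrative; in the real job, the SparkContext and HiveContext creation would follow the setProperty call, which is omitted since it needs the Spark dependency:

```java
// Illustrates why the call must come before the SparkContext is created:
// Hive's configuration reads JVM system properties when it resolves the
// metastore connection, so the property has to be in place first.
public class MetastoreUriConfig {
    public static void main(String[] args) {
        // metastore.example.com is a placeholder, not a real endpoint.
        System.setProperty("hive.metastore.uris", "thrift://metastore.example.com:9083");

        // In the real job, the SparkContext and HiveContext would be created
        // here and would pick up the property set above.
        System.out.println(System.getProperty("hive.metastore.uris"));
    }
}
```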
05-13-2016
05:53 AM
I'm facing exactly the same issue. Trying to run a Spark job that uses a HiveContext from an Oozie Spark action results in the job failing to connect to the Hive metastore. I also tried adding the hive-site.xml in the various places mentioned, to no avail. So where would be the right place to configure the Oozie Spark action to play nicely with Hive?