Configure Spark 2 to Use Existing Hive Metastore


I just installed Spark 2 on my cluster, which is managed by Cloudera Manager. I can run the canary test and the SparkPi job using YARN as the application master. However, when I try to access the Hive metastore I get a "connection refused" error. I suspect that I need to configure Spark 2 with the location of the metastore, but I'm not sure how to do that: there doesn't seem to be a setting in Cloudera Manager, and I'm reluctant to modify any of the configuration files without further guidance. Can anyone help?

 

Thanks,

 

David

1 REPLY

Re: Configure Spark 2 to Use Existing Hive Metastore

Explorer

Hi,

 

Did you set the 'Hive Service' parameter to 'Hive' in the Spark 2 configuration? You can do that in the Cloudera Manager UI. By default it is set to 'none'.
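
Once that dependency is set and the Spark 2 client configuration has been redeployed, a quick way to confirm that Spark can reach the existing metastore is to run a couple of queries from spark2-shell. This is just a minimal sketch; it assumes a 'default' database exists in your metastore:

    // Start an interactive Spark 2 session on the cluster, for example:
    //   spark2-shell --master yarn
    // The built-in `spark` SparkSession should be created with Hive support once
    // the Hive service dependency is configured, so it uses the shared metastore.
    spark.sql("SHOW DATABASES").show()           // databases from the existing Hive metastore
    spark.sql("SHOW TABLES IN default").show()   // tables registered in the 'default' database

If these return the databases and tables you already see from Hive or Beeline, the metastore connection is working; a 'connection refused' at this point would instead suggest the Hive Metastore Server is down or the generated client configuration still points at the wrong host.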

 

Regards,

 

Bart