Spark Settings with Ambari: Can't find corresponding config files on host

Expert Contributor

I need to set some custom Spark configurations (e.g. hive.metastore.sasl.enabled in hive-site.xml). The file /usr/hdp/current/spark-client/conf/hive-site.xml seems to be the corresponding config file for these settings, and it already contains the property:

<configuration>
   <property>
     <name>hive.metastore.sasl.enabled</name>
     <value>true</value>
   </property>
...
</configuration>

The problem is that I can't find this property in the Spark settings in the Ambari UI. I need to set this property (and some more) via the Ambari UI to avoid losing these configurations when the Spark component is restarted (a restart resets the config files according to the Ambari settings, and since I can't see this setting in the Ambari UI, I have no way to make it persistent). How are these files and the Ambari configs matched?

Also, is there a way to set these configurations programmatically from Java without using something like SparkContext or HiveContext, but e.g. System.setProperty(...)? Thank you!
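To illustrate what I mean, here is a rough sketch (the class name is just for illustration, and whether plain JVM system properties are actually picked up by the Hive/Spark configuration is exactly what I'm unsure about):

// Rough sketch only: set the desired Hive properties as JVM system properties
// before any Hive/Spark code builds its configuration. This would not change
// the files on disk, so it would not survive an Ambari-driven restart either.
public class MetastoreConfigSketch {
    public static void main(String[] args) {
        // Would have to run before the first Hive/Spark configuration is created.
        System.setProperty("hive.metastore.sasl.enabled", "true");
        // ... further properties would go here ...

        // ... start the actual Spark/Hive work afterwards ...
    }
}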

1 ACCEPTED SOLUTION

Expert Contributor

So the solution for changing Spark's hive-site.xml persistently (so that the values survive Spark service restarts) is to find these properties in the Ambari Hive configs and change them there. After restarting first the Hive and then the Spark component via Ambari, the Spark config file /usr/hdp/current/spark-client/conf/hive-site.xml is also updated with the values set in the Ambari Hive configs. Thank you @Doroszlai, Attila! See also the comment above: https://community.hortonworks.com/answers/98279/view.html


4 REPLIES

Super Collaborator

@Daniel Müller this is a general comment, so it might not help, but there is a set of custom property boxes under the Spark service's Configs tab, including one called Custom spark-hive-site-override. The Spark component guide describes a similar custom-property step (for doAs support) on the following page, under the Ambari subsection:

http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.0/bk_spark-component-guide/content/config-sts-...
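To illustrate (the property name and value below are just the doAs example from that guide, so substitute your own keys), anything entered under Custom spark-hive-site-override should, as far as I know, end up as a normal entry in Spark's copy of hive-site.xml after a restart:

<property>
   <name>hive.server2.enable.doAs</name>
   <value>true</value>
</property>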

Explorer

@Daniel Müller look for these properties (the ones present in hive-site.xml) on Hive's config page in Ambari, not Spark's.

Expert Contributor

I was able to find all properties from Spark's hive-site.xml here.

What I'm wondering now is why I have two different files

  • /etc/hive/conf/hive-site.xml
  • /usr/hdp/current/spark-client/conf/hive-site.xml

when both contain some identical properties, e.g.

<property>
   <name>hive.server2.authentication.kerberos.principal</name>
   <value>hive/_HOST@MYREALM.COM</value>
</property>

and both are also changed when I edit the property via the Ambari Hive config (as the Ambari Spark configs don't list them)?
