We are aware that we can create multiple HiveServer2 instances and multiple Hive Metastores to provide highly available services within a cluster. We would like to enable multiple application instances to share a single HDFS cluster, but each instance of the application requires its own Hive database.
Is there a way to configure multiple independent HiveServer2/Metastore instances within a cluster so that each application can leverage the data in the cluster?
It is not possible to point multiple Hive Metastore services or HiveServer2 instances at different databases in an Ambari-managed or Cloudera Manager-managed cluster.
One way to achieve this is to run additional, non-Ambari-managed Hive processes, each with its own set of configuration files, as below:
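Each such process needs its own HIVE_CONF_DIR prepared before launch. A minimal sketch of cloning a base configuration directory and overriding a property that must differ per instance; the temp directories, the stand-in hive-site.xml, and port 10001 are illustrative assumptions (in a real HDP layout you would clone /etc/hive/conf/conf.server into /etc/hive1/conf/conf.server and also point each instance at its own backing database):

```shell
# Stand-ins for /etc/hive/conf/conf.server and /etc/hive1/conf/conf.server.
BASE=$(mktemp -d)
INSTANCE=$(mktemp -d)

# Minimal stand-in hive-site.xml carrying the default Thrift port.
cat > "$BASE/hive-site.xml" <<'EOF'
<configuration>
  <property>
    <name>hive.server2.thrift.port</name>
    <value>10000</value>
  </property>
</configuration>
EOF

# Clone the base config, then give the second instance its own port so the
# two HiveServer2 processes do not collide. (javax.jdo.option.ConnectionURL
# would be overridden the same way to give each instance its own database.)
cp -r "$BASE"/. "$INSTANCE"/
sed -i 's|<value>10000</value>|<value>10001</value>|' "$INSTANCE/hive-site.xml"

grep -o '<value>[0-9]*</value>' "$INSTANCE/hive-site.xml"   # prints <value>10001</value>
```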
su hive -l -c 'HIVE_CONF_DIR=/etc/hive1/conf/conf.server /usr/hdp/current/hive-server2/bin/hiveserver2 -hiveconf hive.metastore.uris=" " -hiveconf hive.log.dir=/var/log/hive1 -hiveconf hive.log.file=hiveserver2.log >/var/log/hive1/hiveserver2.log 2>&1 &'
su hive -l -c 'HIVE_CONF_DIR=/etc/hive2/conf/conf.server /usr/hdp/current/hive-server2/bin/hiveserver2 -hiveconf hive.metastore.uris=" " -hiveconf hive.log.dir=/var/log/hive2 -hiveconf hive.log.file=hiveserver2.log >/var/log/hive2/hiveserver2.log 2>&1 &'
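The `>file 2>&1` tail of each command sends both stdout and stderr of that server to its own per-instance log file, so the two processes never write into each other's logs. A self-contained sketch of that redirection pattern (a temp directory stands in for /var/log/hive1):

```shell
# Demonstrate the per-instance log capture used above: stdout and stderr
# of a command group both land in that instance's own log file.
LOGDIR=$(mktemp -d)   # stand-in for /var/log/hive1
{
  echo "server starting"          # written to stdout
  echo "warning: sample" >&2      # written to stderr
} > "$LOGDIR/hiveserver2.log" 2>&1

cat "$LOGDIR/hiveserver2.log"     # both lines appear in the one log file
```

Clients would then reach each instance through whatever hive.server2.thrift.port its configuration directory sets, for example with `beeline -u jdbc:hive2://<host>:<port>/default` (host and port here are placeholders for your own values).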