Created 08-21-2017 05:10 PM
Hi,
We want to use an external Hive metastore backed by Amazon RDS in Cloudbreak. In the Cloudbreak UI -> Show Advanced Options we can configure DB properties for Ambari, but we are not sure about Hive.
Can we configure the properties (javax.jdo.option.ConnectionURL, javax.jdo.option.ConnectionUserName, javax.jdo.option.ConnectionPassword) for the external Hive metastore in blueprint.json (hive-site.xml), or is there another way to configure them? Is there any documentation on how to configure Amazon RDS in Cloudbreak?
As per the above link, we can add environment variables in the Cloudbreak deployer's Profile. But if we put the variables in the Profile file, will they apply only to the Hive service?
Thanks.
Created 08-21-2017 07:46 PM
Unfortunately this is currently not possible through the Cloudbreak UI; it only works if you configure it in your blueprint and then use that blueprint when you create the cluster.
Try Hortonworks Data Cloud, which already supports this feature: https://hortonworks.github.io/hdp-aws/create/index.html#hive-metastore
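For reference, a minimal sketch of what such a hive-site block could look like in the blueprint's "configurations" section (the RDS endpoint, database name, user, and password below are placeholders, not values from this thread):

{
  "hive-site": {
    "properties": {
      "javax.jdo.option.ConnectionDriverName": "com.mysql.jdbc.Driver",
      "javax.jdo.option.ConnectionURL": "jdbc:mysql://my-rds-endpoint.us-east-1.rds.amazonaws.com:3306/hive",
      "javax.jdo.option.ConnectionUserName": "hiveuser",
      "javax.jdo.option.ConnectionPassword": "hivepassword"
    }
  }
}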
Br,
R
Created 08-22-2017 02:30 AM
Thanks for your response. At this moment we are not using HDC, so we will try configuring the properties in the blueprint. When we launch a cluster using Ambari we can test the connection to external DBs for services like Hive, Oozie, etc., but we are not sure how to do this in Cloudbreak.
Created on 08-22-2017 11:29 AM - edited 08-17-2019 06:32 PM
@rdoktorics @Sonu Sahi @rkovacs
We were able to configure the properties in the blueprint and launch the HDP cluster, but we came across an issue where the Hive metastore was not able to start. When we checked the Hive configuration, all the external DB properties were there for the Hive metastore, but Hive Database was pointing to "New MySQL Database" when it was supposed to be "Existing MySQL / MariaDB Database".
After changing it manually and running the command "ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar", we were able to start the Hive metastore successfully.
Is there any way to choose "Existing MySQL / MariaDB Database" during the launch of the HDP cluster, so that it uses the external DB as the Hive metastore?
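In blueprint terms, that Ambari UI dropdown maps to properties in hive-env. A sketch, assuming the property names used by Ambari's HDP stack (the hive_database value is the verbatim UI label):

{
  "hive-env": {
    "properties": {
      "hive_database": "Existing MySQL / MariaDB Database",
      "hive_database_type": "mysql"
    }
  }
}

With this in place, Ambari should treat the metastore database as pre-existing instead of trying to install a new MySQL instance on the cluster.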
Thanks.
Created 08-22-2017 01:10 PM
Hi All,
There was a misconfiguration in our blueprint file, which we have since rectified. After that we were able to launch the HDP cluster and start all Hadoop services using Cloudbreak.
Thanks all.
Created 08-26-2017 05:41 PM
@Shyam Shaw Need one help: I was trying to add the configuration in the blueprint but it failed. Where do we need to put the configuration for the RDS external metastore (properties like the ones below)?
javax.jdo.option.ConnectionDriverName
javax.jdo.option.ConnectionURL
javax.jdo.option.ConnectionUserName
javax.jdo.option.ConnectionPassword
Do we need to add them in the node configuration where hive_server is installed, or at the end where the blueprint name is defined?
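In an Ambari blueprint, cluster-wide settings like these normally go into the top-level "configurations" array, a sibling of "host_groups" and "Blueprints", rather than into an individual host group. A minimal sketch of the overall structure (host group name, blueprint name, stack version, and connection values are placeholders):

{
  "configurations": [
    {
      "hive-site": {
        "properties": {
          "javax.jdo.option.ConnectionDriverName": "com.mysql.jdbc.Driver",
          "javax.jdo.option.ConnectionURL": "jdbc:mysql://my-rds-endpoint.us-east-1.rds.amazonaws.com:3306/hive",
          "javax.jdo.option.ConnectionUserName": "hiveuser",
          "javax.jdo.option.ConnectionPassword": "hivepassword"
        }
      }
    }
  ],
  "host_groups": [
    {
      "name": "master",
      "components": [
        { "name": "HIVE_METASTORE" },
        { "name": "HIVE_SERVER" }
      ],
      "cardinality": "1"
    }
  ],
  "Blueprints": {
    "blueprint_name": "hdp-rds-metastore",
    "stack_name": "HDP",
    "stack_version": "2.6"
  }
}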
Thanks
Created on 08-28-2017 06:04 AM - edited 08-17-2019 06:32 PM
Created 09-20-2017 06:49 AM
@Shyam Shaw Thanks much for the response. It works perfectly fine. Have you worked on Hortonworks Data Cloud (HDC)?
I wanted to configure an external Hive metastore there by passing the same configuration in JSON format as above, but I wasn't able to spin up the cluster successfully. It fails with the error 'Ambari Blueprint could not be added: Configuration Maps must hold a single configuration type each'.
Note: I am able to configure the external metastore (RDS MySQL) via the Ambari interface, but I want to do it while spinning up the HDC cluster.
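That validation error comes from Ambari itself: each entry in the blueprint's "configurations" array must contain exactly one configuration type. A sketch of the difference (property values are placeholders):

Rejected, two types in one map:

  "configurations": [
    {
      "hive-site": { "properties": { "javax.jdo.option.ConnectionUserName": "hiveuser" } },
      "hive-env": { "properties": { "hive_database": "Existing MySQL / MariaDB Database" } }
    }
  ]

Accepted, one type per map:

  "configurations": [
    { "hive-site": { "properties": { "javax.jdo.option.ConnectionUserName": "hiveuser" } } },
    { "hive-env": { "properties": { "hive_database": "Existing MySQL / MariaDB Database" } } }
  ]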
Thanks
Created 09-20-2017 12:43 PM
I haven't configured an external Hive metastore on HDC. Have you gone through the below article by @Dominika Bialek, which explains "How to set up shared RDS as Hive or Druid metastore on Hortonworks Data Cloud for AWS"?