
Specify RDS for Hive Metastore in Cloudbreak

Expert Contributor

Hi,

We want to use an external Hive metastore backed by Amazon RDS in Cloudbreak. In the Cloudbreak UI -> Show Advanced Options, we can configure DB properties for Ambari, but we are not sure how to do the same for Hive.

Can we configure the properties (javax.jdo.option.ConnectionURL, javax.jdo.option.ConnectionUserName, javax.jdo.option.ConnectionPassword) for an external Hive metastore in blueprint.json (hive-site.xml), or is there another way to configure them? Is there any documentation on how to configure Amazon RDS in Cloudbreak?
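For reference, in an Ambari blueprint these properties would sit under a hive-site entry, roughly like this (the endpoint and credentials below are placeholders):

{
  "hive-site": {
    "properties": {
      "javax.jdo.option.ConnectionURL": "jdbc:mysql://your-rds-endpoint.rds.amazonaws.com:3306/hive",
      "javax.jdo.option.ConnectionUserName": "hive",
      "javax.jdo.option.ConnectionPassword": "your-password"
    }
  }
}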

https://community.hortonworks.com/questions/67731/cloudbreak-for-aws-and-specifying-rds-for-hivemeta...

As per the above link, we can add environment variables in the Cloudbreak deployer's Profile. But if we put the variables in the Profile file, will they apply only to the Hive service?

Thanks.

1 ACCEPTED SOLUTION

Super Collaborator

@Shyam Shaw

Unfortunately, this is currently not possible from the Cloudbreak UI; the only way is to configure it in your blueprint and then use that blueprint when you create the cluster.

Try Hortonworks Data Cloud, which already supports this feature: https://hortonworks.github.io/hdp-aws/create/index.html#hive-metastore

Br,

R


8 REPLIES


Expert Contributor

@rdoktorics

Thanks for your response. At this moment we are not using HDC. We will configure the properties in the blueprint and give it a try. When we launch a cluster using Ambari, we can test the connection to external DBs for services like Hive, Oozie, etc., but we are not sure how to do this in Cloudbreak.

Expert Contributor

@rdoktorics @Sonu Sahi @rkovacs

We were able to configure the properties in the blueprint and launch the HDP cluster, but we ran into an issue where the Hive metastore could not start. When we checked the Hive configuration, we found that all the external DB properties were there for the Hive metastore, but "Hive Database" was pointing to "New MySQL Database" when it was supposed to be "Existing MySQL / MariaDB Database".

After changing it manually and running "ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar", we were able to start the Hive metastore successfully.

Is there any way to choose "Existing MySQL / MariaDB Database" during the launch of the HDP cluster so that it treats the external DB as the Hive metastore?

Thanks.

[screenshot: 34399-cloudbreak.png]
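For reference, the Ambari setting behind that "Hive Database" choice is the hive_database property in hive-env; assuming the standard Ambari property names and values, a blueprint entry like the following should preselect the existing database:

{
  "hive-env": {
    "properties": {
      "hive_database": "Existing MySQL / MariaDB Database",
      "hive_database_type": "mysql"
    }
  }
}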

Expert Contributor

Hi All,

There was a misconfiguration in our blueprint file, which we have rectified. After that, we were able to launch the HDP cluster and start all Hadoop services using Cloudbreak.

Thanks, all.

New Contributor

@Shyam Shaw, I need some help. I was trying to add the configuration in the blueprint but it failed. Where exactly do we need to put the configuration for the RDS external metastore (properties like the ones below)?

javax.jdo.option.ConnectionDriverName

javax.jdo.option.ConnectionURL

javax.jdo.option.ConnectionUserName

javax.jdo.option.ConnectionPassword

Do we need to add them in the node configuration where hive_server is installed, or at the end where the blueprint name is defined?

Thanks

Expert Contributor

Hi @Shaun Michael

You need to add the properties at the beginning of the JSON file, in the section where the configuration for the Hadoop services is defined. Below is the snippet from the JSON file we used in our configuration. Make sure to validate the JSON before using it in Cloudbreak.

[screenshot: 34504-rds-configuration.png]
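In outline, the file looks like the sketch below; the endpoint, credentials, and host group contents here are placeholders rather than our actual values:

{
  "configurations": [
    {
      "hive-site": {
        "properties": {
          "javax.jdo.option.ConnectionDriverName": "com.mysql.jdbc.Driver",
          "javax.jdo.option.ConnectionURL": "jdbc:mysql://your-rds-endpoint.rds.amazonaws.com:3306/hive",
          "javax.jdo.option.ConnectionUserName": "hive",
          "javax.jdo.option.ConnectionPassword": "your-password"
        }
      }
    },
    {
      "hive-env": {
        "properties": {
          "hive_database": "Existing MySQL / MariaDB Database",
          "hive_database_type": "mysql"
        }
      }
    }
  ],
  "host_groups": [
    {
      "name": "master",
      "cardinality": "1",
      "components": [
        { "name": "HIVE_METASTORE" },
        { "name": "HIVE_SERVER" }
      ]
    }
  ],
  "Blueprints": {
    "blueprint_name": "hdp-rds-hive",
    "stack_name": "HDP",
    "stack_version": "2.6"
  }
}

Note that each configuration type (hive-site, hive-env) gets its own map in the configurations array.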

New Contributor

@Shyam Shaw Thanks very much for the response. It works perfectly fine. Have you worked on Hortonworks Data Cloud (HDC)?

I wanted to configure an external Hive metastore there by passing the same configuration in JSON format as above, but I wasn't able to spin up the cluster successfully. It fails with the error 'Ambari Blueprint could not be added: Configuration Maps must hold a single configuration type each'.
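If that error means what it usually does in Ambari, each map in the configurations array may hold only one configuration type, so hive-site and hive-env have to be separate entries:

Rejected (two configuration types in one map):

{ "configurations": [ { "hive-site": { ... }, "hive-env": { ... } } ] }

Accepted (one type per map):

{ "configurations": [ { "hive-site": { ... } }, { "hive-env": { ... } } ] }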

Note: I am able to configure the external metastore (RDS MySQL) via the Ambari interface, but I want to do it while spinning up the HDC cluster.

Thanks

Expert Contributor

Hi @Shaun Michael

I haven't configured an external Hive metastore on HDC. Have you gone through the article below by @Dominika Bialek, which explains "How to set up shared RDS as Hive or Druid metastore on Hortonworks Data Cloud for AWS"?