Created on 04-09-2018 05:42 PM - edited 08-17-2019 07:48 AM
Cloudbreak enables enterprises to provision Hortonworks platforms in Public (AWS + GCP + Azure) and Private (OpenStack) cloud environments. It simplifies the provisioning, management, and monitoring of on-demand HDP and HDF clusters in virtual and cloud environments.
This article focuses on deploying HDP and HDF clusters on Google Cloud. Using open source Cloudbreak, you can spin up a connected data platform (HDP and HDF clusters) on the cloud vendor of your choice.
Hortonworks recently announced the general availability of the Cloudbreak 2.4 release, which includes several major enhancements. For a detailed overview of Cloudbreak 2.4, see the following HCC article:
https://community.hortonworks.com/articles/174532/overview-of-cloudbreak-240.html
Also see the following article for details on the Cloudbreak 2.5 Technical Preview:
https://community.hortonworks.com/content/kbentry/182293/whats-new-in-cloudbreak-250-tp.html
This article assumes that you have already installed and launched a Cloudbreak instance, either on your own custom VM image or on Google Cloud Platform. You can follow the Cloudbreak documentation, which describes both options:
https://docs.hortonworks.com/HDPDocuments/Cloudbreak/Cloudbreak-2.5.0/content/index.html
https://docs.hortonworks.com/HDPDocuments/Cloudbreak/Cloudbreak-2.5.0/content/gcp-launch/index.html
The first step before provisioning a cluster is to create a Cloudbreak credential for GCP. Cloudbreak uses this credential to create the required resources on GCP.
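The credential is backed by a GCP service account key that you download from the Google Cloud console. As a minimal illustrative sketch (the file name is hypothetical; the field names follow Google's standard service-account JSON key format, not a Cloudbreak API), the following Python snippet checks that a downloaded key file contains the fields the Cloudbreak credential form asks for:

```python
import json

# Fields present in Google's standard service-account JSON key format
# that the Cloudbreak GCP credential needs: the project ID, the service
# account email, and the private key.
REQUIRED_FIELDS = {"project_id", "client_email", "private_key"}


def check_gcp_key(path):
    """Return the set of required fields missing from a JSON key file."""
    with open(path) as f:
        key = json.load(f)
    return REQUIRED_FIELDS - key.keys()
```

Running `check_gcp_key("service-account-key.json")` (hypothetical file name) returns an empty set when the key file is complete, which is a quick sanity check before pasting the key details into the Cloudbreak UI.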
Following are the steps to create a GCP credential: