
Cloudbreak Overview


Cloudbreak enables enterprises to provision Hortonworks platforms in public (AWS, GCP, and Azure) and private (OpenStack) cloud environments. It simplifies the provisioning, management, and monitoring of on-demand HDP and HDF clusters in virtual and cloud environments.

Following are the primary use cases for Cloudbreak:

  • Dynamically configure and manage clusters on public or private clouds.
  • Seamlessly manage elasticity requirements as cluster workloads change.
  • Define network boundaries and configure security groups.

This article focuses on deploying HDP and HDF clusters on Google Cloud.

Cloudbreak Benefits

You can spin up connected data platforms (HDP and HDF clusters) on the cloud vendor of your choice using open-source Cloudbreak 2.0, which addresses the following scenarios:

  • Defining a comprehensive data strategy irrespective of deployment architecture (cloud or on-premises).
  • Addressing hybrid (on-premises and cloud) requirements.
  • Supporting key multi-cloud requirements.
  • Providing consistent and familiar security and governance across on-premises and cloud environments.

Cloudbreak 2 Enhancements

Recently, Hortonworks announced the general availability of the Cloudbreak 2.4 release.

Following are some of the major enhancements in Cloudbreak 2.4:

  • New UX / UI: a greatly simplified and streamlined user experience.
  • New CLI: a new CLI that eases automation, an important capability for cloud DevOps.
  • Custom Images: advanced support for “bring your own image”, a critical feature to meet enterprise infrastructure requirements.
  • Kerberos: ability to enable Kerberos security on your clusters, a must for any enterprise deployment.

You can check the following HCC article for a detailed overview of Cloudbreak 2.4.

Also check the following article for details on the Cloudbreak 2.5 tech preview.

Prerequisites for Google Cloud Platform

This article assumes that you have already installed and launched the Cloudbreak instance, either from your own custom VM image or on Google Cloud Platform.

You can follow the Cloudbreak documentation, which describes both options.

  • To launch Cloudbreak and provision clusters, make sure you have a Google Cloud account; create one if you do not already have one.
  • Create a new project in GCP (e.g., a project named GCPIntegration).
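If you prefer the command line, the project step above can be sketched with the gcloud CLI. The project ID below is hypothetical (GCP project IDs must be lowercase letters, digits, and hyphens), so substitute your own:

```shell
# Sketch only: create a GCP project for Cloudbreak from the CLI.
# "gcp-integration-demo" is a hypothetical project ID; substitute your own.
PROJECT_ID="gcp-integration-demo"

if command -v gcloud >/dev/null 2>&1; then
  # --name sets the display name; the project ID itself must be globally unique
  gcloud projects create "$PROJECT_ID" --name "GCPIntegration" \
    || echo "project creation failed (run with an authenticated account)"
  gcloud config set project "$PROJECT_ID"
else
  echo "gcloud not installed; would create project ${PROJECT_ID}"
fi
```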


  • To launch clusters on GCP, you must have a service account that Cloudbreak can use. Assign it the admin roles for Compute Engine and Storage; you can check the required service account admin roles at Admin Roles.
  • Make sure you create a P12 key and store it safely.
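Assuming the gcloud CLI, the service account steps above can be sketched as follows. The account name and the roles shown are assumptions; check the Admin Roles link above for the exact role list your Cloudbreak version requires:

```shell
# Sketch only: create a service account for Cloudbreak, grant admin roles
# for Compute Engine and Storage, and export a P12 key.
# All names are hypothetical; adjust the roles to the documented list.
PROJECT_ID="gcp-integration-demo"    # assumed project ID
SA_NAME="cloudbreak-sa"              # assumed service-account name
SA_EMAIL="${SA_NAME}@${PROJECT_ID}.iam.gserviceaccount.com"

if command -v gcloud >/dev/null 2>&1; then
  gcloud iam service-accounts create "$SA_NAME" \
    --project "$PROJECT_ID" --display-name "Cloudbreak" \
    || echo "service-account creation failed (requires authentication)"

  # Assumed roles; Cloudbreak needs Compute Engine and Storage admin rights.
  for role in roles/compute.admin roles/storage.admin; do
    gcloud projects add-iam-policy-binding "$PROJECT_ID" \
      --member "serviceAccount:${SA_EMAIL}" --role "$role"
  done

  # Cloudbreak 2.x takes a P12 key (not JSON); store the file safely.
  gcloud iam service-accounts keys create cloudbreak-key.p12 \
    --iam-account "$SA_EMAIL" --key-file-type p12 \
    || echo "key creation failed (requires authentication)"
else
  echo "gcloud not installed; service account would be ${SA_EMAIL}"
fi
```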


  • This article assumes that you have successfully met the prerequisites and are able to open the Cloudbreak UI by visiting https://<IP_Addr or HostName>. Upon successful login, you are redirected to the dashboard.


Create Cloudbreak Credential for GCP

The first step before provisioning a cluster is to create a Cloudbreak credential for GCP. Cloudbreak uses this GCP credential to create the required resources on GCP.

Following are the steps to create a GCP credential:

  • In the Cloudbreak UI, select Credentials from the navigation pane and click Create Credential.
  • Under Cloud provider, select Google Cloud Platform.
  • Provide the Google project ID and the service account email ID from your Google project, and upload the P12 key that you created in the section above.


  • Once you provide all the details, Cloudbreak creates the GCP credential, which is then displayed in the Credentials pane. The next article, Part 2, covers in detail how to provision HDP and HDF clusters using the GCP credential.
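Before uploading the key in the credential form, you can sanity-check the P12 file locally with openssl. This is a minimal sketch with a hypothetical file name; Google-issued P12 keys use the fixed passphrase notasecret:

```shell
# Sketch only: verify a GCP P12 service-account key before uploading it
# in the Cloudbreak credential form. The file name is hypothetical;
# Google-issued P12 keys use the fixed passphrase "notasecret".
KEY_FILE="cloudbreak-key.p12"

check_p12() {
  if [ ! -f "$1" ]; then
    echo "key file not found: $1"
  elif openssl pkcs12 -in "$1" -noout -passin pass:notasecret 2>/dev/null; then
    echo "P12 key looks valid"
  else
    echo "P12 key failed verification"
  fi
}

RESULT="$(check_p12 "$KEY_FILE")"
echo "$RESULT"
```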
Last update: 08-17-2019 07:48 AM