Member since
01-07-2019
217
Posts
135
Kudos Received
18
Solutions
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2127 | 12-09-2021 09:57 PM
 | 1955 | 10-15-2018 06:19 PM
 | 9455 | 10-10-2018 07:03 PM
 | 4282 | 07-24-2018 06:14 PM
 | 1559 | 07-06-2018 06:19 PM
05-29-2019
11:56 PM
1 Kudo
Cloudbreak 2.9.1 maintenance release is now available. If you are using an earlier version of Cloudbreak, you can upgrade now to pick up the latest bug fixes. If you are new to Cloudbreak, you can get started by launching Cloudbreak on AWS, Azure, GCP, or OpenStack from a template.

Useful links:
- Release notes
- Upgrade steps
- Get started on AWS
- Get started on Azure
- Get started on GCP
04-08-2019
06:31 PM
If you would like to upgrade Cloudbreak in an environment with no internet access, refer to the following documentation: https://docs.hortonworks.com/HDPDocuments/Cloudbreak/Cloudbreak-2.9.0/upgrade/content/cb_upgrade-cloudbreak-no-internet.html
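The linked documentation covers the exact procedure; as a rough sketch of the general pattern for getting container images into an air-gapped host, it follows the usual docker save/load workflow. The image name and tarball path below are illustrative, not taken from the Cloudbreak docs — use the image list from the upgrade documentation for your target version.

```shell
# On a host WITH internet access: pull the images and save them to a tarball.
# (Image name and tag are placeholders -- see the upgrade docs for the real list.)
docker pull hortonworks/cloudbreak:2.9.0
docker save -o cloudbreak-images.tar hortonworks/cloudbreak:2.9.0

# Transfer cloudbreak-images.tar to the air-gapped Cloudbreak host
# (for example with scp or physical media), then load it there:
docker load -i cloudbreak-images.tar
```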
04-08-2019
06:28 PM
If you would like to configure Cloudbreak to use an external database that uses SSL, refer to the following documentation: https://docs.hortonworks.com/HDPDocuments/Cloudbreak/Cloudbreak-2.9.0/configure/content/cb_configure-external-ssl.html
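For context, pointing Cloudbreak at an external database is done through the deployer's Profile file with environment variables along these lines; the host, port, and credentials below are placeholders, and the SSL-specific settings are covered in the linked documentation.

```shell
# Profile fragment (values are placeholders; see the linked docs
# for the SSL-specific configuration):
export DATABASE_HOST=mydb.example.com
export DATABASE_PORT=5432
export DATABASE_USERNAME=cbuser
export DATABASE_PASSWORD=cbpassword
```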
02-12-2019
07:39 AM
When I use WASB as storage, do I only need a master node and compute nodes when creating a cluster? Is there no need for worker nodes, since I am using WASB instead of HDFS?
02-07-2019
04:54 PM
5 Kudos
Cloudbreak 2.9.0 is now available! It is a general availability (GA) release, so - with the exception of some features that are marked as TP - it is suitable for production.

Try it now:
- Upgrade to 2.9.0
- Quickly deploy by using quickstart on AWS, Azure, or Google Cloud
- Install manually on AWS, Azure, Google Cloud, or OpenStack

New features

Cloudbreak 2.9.0 introduces the following new features. While some of these features were introduced in Cloudbreak 2.8.0 TP, others are brand new:

Feature | Description | Documentation
---|---|---
Specifying resource group name on Azure | When creating a cluster on Azure, you can specify the name for the new resource group where the cluster will be deployed. | Resource group name
Multiple existing security groups on AWS | When creating a cluster on AWS, you can select multiple existing security groups. This option is available only when an existing VPC is selected. | Create a cluster on AWS
EBS volume encryption on AWS | You can optionally configure encryption for EBS volumes attached to cluster instances running on EC2. Default or customer-managed encryption keys can be used. | EBS encryption on AWS
Shared VPCs on GCP | When creating a cluster on Google Cloud, you can place it in an existing shared VPC. | Shared networks on GCP
GCP volume encryption | By default, Google Compute Engine encrypts data at rest stored on disks. You can optionally configure the encryption keys used for disk encryption. Customer-supplied (CSEK) or customer-managed (CMEK) encryption keys can be used. | Disk encryption on GCP
Workspaces | Cloudbreak introduces a new authorization model, which allows resource sharing via workspaces. In addition to a default personal workspace, users can create additional shared workspaces. | Workspaces
Operations audit logging | Cloudbreak records an audit trail of the actions performed by Cloudbreak users as well as those performed by the Cloudbreak application. | Operations audit logging
Updating long-running clusters | Cloudbreak supports updating the base image's operating system and any third-party packages that have been installed, as well as upgrading Ambari, HDP, and HDF. | Updating OS and tools on long-running clusters and Updating Ambari and HDP/HDF on long-running clusters
HDP 3.1 | Cloudbreak introduces two default HDP 3.1 blueprints and allows you to create your custom HDP 3.1 blueprints. | Default cluster configurations
HDF 3.3 | Cloudbreak introduces two default HDF 3.3 blueprints and allows you to create your custom HDF 3.3 blueprints. To get started, refer to the How to create a NiFi cluster HCC post. | Default cluster configurations
Recipe parameters | Supported parameters can be specified in recipes as variables by using mustache-style templating with the "{{{ }}}" syntax. | Writing recipes and Recipe parameters
Shebang in Python recipes | Cloudbreak supports using a shebang in Python scripts run as recipes. | Writing recipes

Technical preview features

The following features are technical preview (not suitable for production):

Feature | Description | Documentation
---|---|---
AWS GovCloud (TP) | You can install Cloudbreak and create Cloudbreak-managed clusters on AWS GovCloud. | Deploying on AWS GovCloud
Azure ADLS Gen2 (TP) | When creating a cluster on Azure, you can optionally configure access to ADLS Gen2. This feature is technical preview. | Configuring access to ADLS Gen2
New and changed data lake blueprints (TP) | Cloudbreak includes three data lake blueprints: two for HDP 2.6 (HA and Atlas) and one for HDP 3.1. Note that Hive Metastore has been removed from the HDP 3.x data lake blueprints, but setting up an external database allows all clusters attached to a data lake to connect to the same Hive Metastore. To get started with data lakes, refer to the How to create a data lake with Cloudbreak 2.9 HCC post. | Working with data lakes

Default blueprints

Cloudbreak 2.9.0 includes the following HDP 2.6, HDP 3.1, and HDF 3.3 workload cluster blueprints. In addition, HDP 3.1 and HDP 2.6 data lake blueprints are available as technical preview. Note that Hive Metastore has been removed from the HDP 3.x data lake blueprints, but setting up an external database allows all clusters attached to a data lake to connect to the same Hive Metastore.

Documentation links:
- How to create a data lake with Cloudbreak 2.9 (HCC post)
- How to create a NiFi cluster (HCC post)
- Cloudbreak 2.9.0 documentation (Official docs)
- Release notes (Official docs)
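To illustrate the shebang-in-Python-recipes feature mentioned above, here is a minimal sketch of what a Python recipe script might look like. The log path and the idea of recording the node's host name are my own illustration, not from the Cloudbreak docs; recipes simply run as scripts on the selected cluster nodes.

```python
#!/usr/bin/env python
# Minimal sketch of a Python recipe script (shebang support is the
# feature noted above). A recipe runs on each selected cluster node;
# this one just records the node's host name to a log file.
# The log path is illustrative.
import socket

def run_recipe(log_path="/tmp/recipe-example.log"):
    hostname = socket.gethostname()
    with open(log_path, "w") as log:
        log.write("recipe ran on %s\n" % hostname)
    return hostname

if __name__ == "__main__":
    print(run_recipe())
```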
10-11-2018
01:32 AM
2 Kudos
Cloudbreak 2.7.2 maintenance release is now available. If you are using an earlier version of Cloudbreak, you can upgrade now to pick up the latest bug fixes. If you are new to Cloudbreak, you can get started by launching Cloudbreak on AWS, Azure, GCP, or OpenStack from a template.

Useful links:
- Release notes
- Upgrade steps
- Get started
10-10-2018
10:19 PM
@Dominika Bialek Thanks, got it.
09-20-2018
04:42 PM
I'm glad the features are helpful. Sorry, I'm not authorized to share the roadmap outside of Hortonworks. All the best 🙂
09-17-2018
06:30 PM
@Jakub Igla That's great! This article that I posted describes very basic functionality. I posted it because I realized that not everyone knew about it.
08-24-2018
09:02 PM
In the following video, I demonstrate how to create a Cloudbreak credential on Google Cloud.
Video link: https://youtu.be/uVYpgz9m4eE
The Cloudbreak version used in this video is Cloudbreak 2.7.1.
To obtain the roles that need to be assigned to the service account, refer to the Service account for GCP credential documentation.
If you are using a corporate Google Cloud account, you may be unable to perform some of the steps (such as service account creation and role assignment) yourself; in that case, ask your Google Cloud admin to perform them for you.
If you are using a Cloudbreak version other than 2.7.1, refer to the equivalent documentation for that version. You can access Cloudbreak documentation from the Hortonworks docs page at https://docs.hortonworks.com.
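If you prefer the command line over the Cloud Console shown in the video, the service account setup can be sketched roughly with gcloud as follows. The project ID, account name, and the role shown are placeholders; the actual roles Cloudbreak requires are listed in the Service account for GCP credential documentation.

```shell
# Placeholders: my-project, cloudbreak-sa, and the role below are examples.
gcloud iam service-accounts create cloudbreak-sa \
    --display-name="Cloudbreak credential" --project=my-project

# Assign the roles required by Cloudbreak (see the linked docs for the list).
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:cloudbreak-sa@my-project.iam.gserviceaccount.com" \
    --role="roles/compute.instanceAdmin.v1"

# Cloudbreak 2.7.x GCP credentials use a P12 key file for the service account.
gcloud iam service-accounts keys create cloudbreak-key.p12 \
    --iam-account=cloudbreak-sa@my-project.iam.gserviceaccount.com \
    --key-file-type=p12
```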