With the launch of CDP Public Cloud 7.2.15, Cloudera Streams Messaging for Data Hub gains some powerful new features! Streams Messaging now supports multi-availability zone deployments for high availability, OAuth2 authentication for clients connecting to Kafka brokers and Schema Registry, an improved Streams Messaging Manager Kafka Connect UI, Kafka Connect security features, Debezium CDC connectors, the ability to import Kafka entities into Atlas, and much more.
Streams Messaging High Availability cluster definitions and template
You can use the template and definitions to deploy highly available Streams Messaging clusters that leverage multiple availability zones, ensuring that functionality is not degraded when a single availability zone has an outage.
Three new cluster definitions are introduced for Streams Messaging. The new definitions are as follows:
- Streams Messaging High Availability for AWS
- Streams Messaging High Availability for Azure (Technical Preview)
- Streams Messaging High Availability for Google Cloud (Technical Preview)
OAuth2 authentication available for Kafka
OAuth2 authentication support is added for the Kafka service. You can now configure Kafka brokers to authenticate clients using OAuth2. For more information, see OAuth2 authentication.
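To give a sense of what this looks like from the client side, here is a minimal Java producer sketch using SASL/OAUTHBEARER. The broker address is a placeholder, and the login callback handler class and identity provider settings depend on your release, so treat this as an illustration rather than a drop-in configuration; the OAuth2 authentication documentation has the exact property names.
```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OAuth2ProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker-1.example.com:9093"); // placeholder broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        props.put("security.protocol", "SASL_SSL"); // plus the usual ssl.truststore.* settings for your environment
        props.put("sasl.mechanism", "OAUTHBEARER");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required;");
        // The login callback handler that obtains tokens from your identity provider, and its
        // token endpoint / client credential properties, are release specific. Look up the exact
        // class and property names in the OAuth2 authentication documentation, for example:
        // props.put("sasl.login.callback.handler.class", "<handler documented for your release>");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("test-topic", "key", "hello, oauth"));
        }
    }
}
```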
Changes for Streams Messaging Manager Kafka Connect UI
The Streams Messaging Manager UI for deploying Kafka Connect connectors has changed from a string-based JSON object to a submission form. Other enhancements include:
- Ability to import existing JSON Configurations and populate the form
- Ability to import NiFi flow definitions for Stateless NiFi connectors and populate the form with the parameters defined in the flow definition
- Ability to mark form fields as secret, protecting sensitive configurations by storing them in the new Kafka Connect secret storage
- Validation messages are now shown at the specific field that has the issue rather than as a single error message about the entire JSON configuration
Secure Kafka Connect
Kafka Connect is now generally available in the Public Cloud and can be used in production environments. This is the result of multiple changes, improvements, and new features related to Kafka Connect security including the following:
- Ranger support for Kafka Connect, which allows policies to be defined at the Connect cluster level or for named connectors themselves, enabling a secure multi-tenant experience
- Secret storage to securely store sensitive configurations used with connectors, such as passwords or tokens
- The Connect REST API can be secured by enabling SPNEGO authentication
- Kafka Connect connectors can be configured to override the JAAS configuration while restricting the usage of the worker principal (a minimal sketch follows this list)
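As an illustration of the last two points, the sketch below registers a connector through the Connect REST API and overrides the producer JAAS configuration at the connector level using the standard producer.override.* prefix from upstream Kafka Connect. The worker host, topic, file path, and credentials are hypothetical, and in a SPNEGO-secured cluster the HTTP request would additionally need to carry a Kerberos Negotiate token (not shown); consult the Kafka Connect security documentation for the exact mechanism and the overrides supported in CDP.
```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterConnectorSketch {
    public static void main(String[] args) throws Exception {
        // Connector submission with a connector-level producer JAAS override.
        // The worker's connector.client.config.override.policy must permit overrides.
        String body = """
            {
              "name": "example-file-source",
              "config": {
                "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
                "topic": "test-topic",
                "file": "/tmp/example-source.txt",
                "producer.override.sasl.jaas.config": "org.apache.kafka.common.security.plain.PlainLoginModule required username=\\"alice\\" password=\\"example-secret\\";"
              }
            }
            """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://connect-worker.example.com:28085/connectors")) // placeholder endpoint
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```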
Debezium Connector support
The following change data capture (CDC) connectors are added to Kafka Connect:
- Debezium MySQL Source
- Debezium Postgres Source
- Debezium SQL Server Source
- Debezium Oracle Source
Each of the connectors requires CDP-specific steps before it can be deployed. For more information, see Connectors.
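For orientation, here is a hypothetical Debezium MySQL source configuration expressed as plain key/value pairs, the same pairs you would enter in the SMM connector form. Hostnames and credentials are placeholders and the property names follow the Debezium 1.x series; verify them, along with the CDP-specific prerequisite steps, in the Connectors documentation before deploying.
```java
import java.util.LinkedHashMap;
import java.util.Map;

public class DebeziumMySqlConfigSketch {
    public static void main(String[] args) {
        Map<String, String> config = new LinkedHashMap<>();
        config.put("connector.class", "io.debezium.connector.mysql.MySqlConnector");
        config.put("tasks.max", "1");
        config.put("database.hostname", "mysql.example.com");   // placeholder
        config.put("database.port", "3306");
        config.put("database.user", "debezium");                // placeholder
        config.put("database.password", "********");            // mark as secret in the SMM form
        config.put("database.server.id", "184054");             // unique ID within the MySQL cluster
        config.put("database.server.name", "inventory-server"); // logical name, prefixes topic names
        config.put("database.include.list", "inventory");       // databases to capture
        // Debezium keeps its schema history in a dedicated Kafka topic.
        config.put("database.history.kafka.bootstrap.servers", "broker-1.example.com:9092");
        config.put("database.history.kafka.topic", "schema-changes.inventory");

        config.forEach((key, value) -> System.out.println(key + "=" + value));
    }
}
```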
Importing Kafka entities into Atlas
Kafka topics and clients can now be imported into Atlas as entities (metadata) using a new action available for the Kafka service in Cloudera Manager. The new action is available at Kafka service > Actions > Import Kafka Topics Into Atlas. The action serves as a replacement for, or alternative to, the kafka-import.sh tool. For more information, see Importing Kafka entities into Atlas.
Learn more about these features and others released as part of Cloudera Streams Messaging for Public Cloud Data Hub in the Streams Messaging 7.2.15 Documentation.