The Kafka console producer & consumer client utilities are very simple to use in an unsecured environment, such as with many legacy CDH clusters or a standalone install. However, CDP Public Cloud spins up secured by default, which means we must get our feet wet with Kerberos if we want to use the Kafka client tools. Kerberos isn't the easiest topic, but thankfully we won't need to get too deep into it. This isn't meant to be a Kerberos primer; we'll cover just enough to achieve the minimal goal of producing and consuming basic messages.

Pre-requisites

  • A CDP environment
  • A Streams Messaging data hub
  • root access to the data hub hosts

Building a Kafka Configuration
The Kafka client utilities require a number of SSL- and SASL-related properties in order to talk to Kafka. We'll put the properties into a configuration file named kafka-ssl.config for ease of use, which will look something like this boilerplate:

security.protocol = SASL_SSL
sasl.mechanism=GSSAPI
sasl.kerberos.service.name = kafka
ssl.truststore.type = jks
ssl.truststore.location = ""
ssl.truststore.password = ""
sasl.jaas.config =

Fortunately, these values can all be found on the broker hosts if we know where to look, although we need root access to read most of them.

truststore location:

ls -lart /run/cloudera-scm-agent/process/*kafka-KAFKA_BROKER/cm-auto-global_truststore.jks

Note the wildcard in the path. Every Streams Messaging data hub will have a slightly different path here, so we need the wildcard to find it.
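
If you plan to script this, a small sketch like the following can capture the resolved path into a shell variable for later use. It assumes at least one broker role directory matches the wildcard and that the newest one belongs to the live process:

# Resolve the wildcard to a concrete path; -t sorts by modification time so the newest process dir wins
TRUSTSTORE_LOCATION=$(ls -t /run/cloudera-scm-agent/process/*kafka-KAFKA_BROKER/cm-auto-global_truststore.jks 2>/dev/null | head -1)
echo "$TRUSTSTORE_LOCATION"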

truststore password:

cat /var/run/cloudera-scm-agent/process/*kafka-KAFKA_BROKER/proc.json | grep KAFKA_BROKER_TRUSTORE_PASSWORD

Note the spelling of "TRUSTORE" here. (It must be Italian.) 
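
If you'd like just the password string by itself, something along these lines should work. The exact JSON layout of proc.json isn't documented, so treat the grep/sed pattern as an assumption and fall back to the plain grep above if it comes back empty:

# Pull out only the password value (assumes the usual "KEY" : "value" layout inside proc.json)
grep -o '"KAFKA_BROKER_TRUSTORE_PASSWORD"[^,}]*' \
  /var/run/cloudera-scm-agent/process/*kafka-KAFKA_BROKER/proc.json | sed 's/.*: *"\(.*\)"/\1/'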

keytab location:

ls -lart /run/cloudera-scm-agent/process/*kafka-KAFKA_BROKER/kafka.keytab

jaas.conf:

cat /run/cloudera-scm-agent/process/*kafka-KAFKA_BROKER/jaas.conf

The jaas.conf is actually a collection of configurations referencing various login modules, but we only need the Krb5LoginModule section under KafkaServer. You can copy & paste that section into the sasl.jaas.config property of our Kafka configuration.

KafkaServer {

com.sun.security.auth.module.Krb5LoginModule required
doNotPrompt=true
useKeyTab=true
storeKey=true
keyTab="/var/run/cloudera-scm-agent/process/1546335853-kafka-KAFKA_BROKER/kafka.keytab"
principal="kafka/cnelson2-streams-corebroker0.se-sandb.a465-9q4k.cloudera.site@SE-SANDB.A465-9Q4K.CLOUDERA.SITE";

org.apache.kafka.common.security.scram.ScramLoginModule required
;

org.apache.kafka.common.security.plain.PlainLoginModule required
ldap_url="ldaps://ldap.se-sandb.a465-9q4k.cloudera.site:636"
user_dn_template="uid={0},cn=users,cn=accounts,dc=se-sandb,dc=a465-9q4k,dc=cloudera,dc=site";
};

Client {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
storeKey=true
keyTab="/var/run/cloudera-scm-agent/process/1546335853-kafka-KAFKA_BROKER/kafka.keytab"
principal="kafka/cnelson2-streams-corebroker0.se-sandb.a465-9q4k.cloudera.site@SE-SANDB.A465-9Q4K.CLOUDERA.SITE";
};

Using all the pieces we just found, your completed configuration should look something like this (your specific truststore password, Kafka broker paths, and Kerberos principal will be unique to your data hub). I wrote mine to a file called kafka-ssl.config. Remember to use line-continuation backslashes in the sasl.jaas.config value, and don't miss the trailing semicolon after the principal when pasting in the section from the jaas.conf.

security.protocol = SASL_SSL
sasl.mechanism=GSSAPI
sasl.kerberos.service.name = kafka
ssl.truststore.type = jks
ssl.truststore.location = /run/cloudera-scm-agent/process/1546335853-kafka-KAFKA_BROKER/cm-auto-global_truststore.jks
ssl.truststore.password = 456sbar15i29vaicvcv3s6f9o4
sasl.jaas.config = \
com.sun.security.auth.module.Krb5LoginModule required \
doNotPrompt=true \
useKeyTab=true \
storeKey=true \
keyTab="/var/run/cloudera-scm-agent/process/1546335853-kafka-KAFKA_BROKER/kafka.keytab" \
principal="kafka/cnelson2-streams-corebroker0.se-sandb.a465-9q4k.cloudera.site@SE-SANDB.A465-9Q4K.CLOUDERA.SITE";

Verify Kerberos Ticket

We should first check if we have a valid Kerberos ticket by issuing a klist:

[Screenshot: klist output before kinit, showing no valid ticket]
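
For reference, this is roughly what the check looks like from the shell; the exact wording of the "no ticket" message varies a little between Kerberos builds:

# Check for an existing ticket; with no ticket, MIT Kerberos typically prints something like:
#   klist: No credentials cache found (filename: /tmp/krb5cc_0)
klist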

If no valid ticket is returned, we'll have to generate one. The first step is to identify our principal by reading the keytab we found earlier with ktutil. The astute reader will recognize that the principal is also found in the jaas.conf, but it's handy to know how to use ktutil, so I'll demonstrate it just because. It will list several principals, but we're looking for the kafka one.

[Screenshot: ktutil session listing the principals in kafka.keytab]
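
If you want to reproduce that on the command line, an MIT Kerberos ktutil session looks roughly like this (the keytab path is the example one we located above; yours will have a different process directory):

# Read the keytab and list the principals it contains
ktutil
ktutil:  rkt /var/run/cloudera-scm-agent/process/1546335853-kafka-KAFKA_BROKER/kafka.keytab
ktutil:  list
ktutil:  quit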

Once we have identified the principal, we can use kinit to create a Kerberos ticket.

kinit -kt PATH_TO_KEYTAB FULL_PRINCIPAL

[Screenshot: kinit with the keytab and principal, followed by klist showing a valid ticket]
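
Filled in with the keytab and principal from this example data hub (yours will differ), that comes out to the following; running klist again afterwards should show the new ticket for the kafka principal:

# Obtain a ticket from the broker keytab, then verify it (paths and principal from this example cluster)
kinit -kt /var/run/cloudera-scm-agent/process/1546335853-kafka-KAFKA_BROKER/kafka.keytab \
  kafka/cnelson2-streams-corebroker0.se-sandb.a465-9q4k.cloudera.site@SE-SANDB.A465-9Q4K.CLOUDERA.SITE
klist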

Now that we have a valid ticket, we can think about actually using the Kafka client utilities.
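
As an optional smoke test before producing anything, you can list the topics with kafka-topics using the same configuration file. I'm assuming here that the utility is on the path (it is on the Streams Messaging broker nodes) and reusing one of the example broker FQDNs:

# Optional smoke test: list topics over SASL_SSL using the config file we just built
kafka-topics --bootstrap-server cnelson2-streams-corebroker0.se-sandb.a465-9q4k.cloudera.site:9093 \
  --command-config kafka-ssl.config \
  --list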


Actually Produce & Consume Messages

kafka-console-producer & kafka-console-consumer are tools installed on the broker nodes of your Streams Messaging cluster. They require minimal information to get the basics working: primarily the FQDNs of the brokers (remember SSL uses port 9093) and the topic to which you want to produce. The consumer additionally needs a consumer group. If we run a producer in one window and a consumer in another, we can verify both are working in real time. Of course, there are many other command line options you can supply, but these are the bare minimum to get messages flowing.

In one terminal window run this to begin producing:

kafka-console-producer --broker-list cnelson2-streams-corebroker0.se-sandb.a465-9q4k.cloudera.site:9093,cnelson2-streams-corebroker1.se-sandb.a465-9q4k.cloudera.site:9093,cnelson2-streams-corebroker2.se-sandb.a465-9q4k.cloudera.site:9093 \
--topic test.topic \
--producer.config kafka-ssl.config

And in another terminal window run this to begin consuming:

kafka-console-consumer --bootstrap-server cnelson2-streams-corebroker0.se-sandb.a465-9q4k.cloudera.site:9093,cnelson2-streams-corebroker1.se-sandb.a465-9q4k.cloudera.site:9093,cnelson2-streams-corebroker2.se-sandb.a465-9q4k.cloudera.site:9093 \
--topic test.topic \
--consumer.config kafka-ssl.config \
--group myConsumerGroup \
--from-beginning

Once the producer is up and running, you can simply type messages into the console and you will see them show up in the consumer window.   


[Screenshot: producing & consuming, with a dose of truth]

And there you have it.   
