I found it tricky to make Kafka work with SSL in a kerberized cluster. In this article I share the Ambari settings I used and sample console producer/consumer commands:

1- Install Ambari and deploy a cluster with Kafka

2- Kerberize the cluster using Ambari (the wizard supports AD, MIT Kerberos, or manual keytabs)

3- Create a keystore and truststore with proper certificates; more details here: http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.0/bk_security/content/ch_wire-ssl-certs.html
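
As a rough sketch (the linked docs are the authoritative reference), a self-signed keystore/truststore for testing could be generated with keytool; the alias, hostname, paths and passwords below are placeholders:

#generate a key pair for the broker (self-signed, testing only)
keytool -genkeypair -alias kafka-broker -keyalg RSA -validity 365 -dname "CN=kafka-node1.domain" -keystore /path-to-your-keystore -storepass YOUR-PASSWORD -keypass YOUR-PASSWORD

#export the broker certificate
keytool -exportcert -alias kafka-broker -keystore /path-to-your-keystore -storepass YOUR-PASSWORD -file kafka-broker.crt

#import the broker certificate into the truststore
keytool -importcert -alias kafka-broker -file kafka-broker.crt -keystore /path-to-your-truststore -storepass YOUR-PASSWORD -noprompt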

4- Change Kafka settings in Ambari as below (the resulting broker configuration is sketched after the list):

Kafka Brokers - Listeners: listeners=SASL_PLAINTEXT://localhost:6667, SASL_SSL://localhost:6668

Custom kafka-broker - security.inter.broker.protocol=SASL_PLAINTEXT

Custom kafka-broker - ssl.client.auth=none

Custom kafka-broker - ssl.key.password=YOUR-PASSWORD

Custom kafka-broker - ssl.keystore.location=/path-to-your-keystore

Custom kafka-broker - ssl.keystore.password=YOUR-PASSWORD

Custom kafka-broker - ssl.truststore.location=/path-to-your-truststore

Custom kafka-broker - ssl.truststore.password=YOUR-PASSWORD
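
For reference, once Ambari pushes these settings, the broker configuration (server.properties under /usr/hdp/current/kafka-broker/config/) should end up containing roughly the following; paths and passwords are the placeholders from the list above:

listeners=SASL_PLAINTEXT://localhost:6667,SASL_SSL://localhost:6668
security.inter.broker.protocol=SASL_PLAINTEXT
ssl.client.auth=none
ssl.key.password=YOUR-PASSWORD
ssl.keystore.location=/path-to-your-keystore
ssl.keystore.password=YOUR-PASSWORD
ssl.truststore.location=/path-to-your-truststore
ssl.truststore.password=YOUR-PASSWORD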

5- Add the Kafka public certificate to the default JDK truststore (cacerts) on the node where you are going to test the console producer/consumer
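
A hedged keytool example of that import; the alias and certificate file name are placeholders, changeit is the default cacerts password, and the cacerts path assumes a standard JDK layout:

keytool -importcert -alias kafka-broker -file kafka-broker.crt -keystore $JAVA_HOME/jre/lib/security/cacerts -storepass changeit -noprompt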

6- Try the console producer/consumer with SASL_PLAINTEXT

#get kerberos ticket
kinit
<TYPE YOUR PASSWORD>

#producer plaintext
/usr/hdp/current/kafka-broker/bin/kafka-console-producer.sh --broker-list kafka-node1.domain:6667,kafka-node2.domain:6667 --topic test --security-protocol SASL_PLAINTEXT

#consumer plaintext 
/usr/hdp/current/kafka-broker/bin/kafka-console-consumer.sh --zookeeper zk-node1.domain:2181,zk-node2.domain:2181 --topic test --from-beginning --security-protocol SASL_PLAINTEXT
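
Note: the examples above assume the topic test already exists. If it does not, it can be created with something like the following (partition and replication counts are just examples; on a kerberized cluster you may need to run this as the kafka service user so the ZooKeeper ACLs permit it):

/usr/hdp/current/kafka-broker/bin/kafka-topics.sh --zookeeper zk-node1.domain:2181 --create --topic test --partitions 2 --replication-factor 2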

7- Try the console producer/consumer with SASL_SSL

#kerberos ticket
kinit
<TYPE YOUR PASSWORD>

#producer ssl
/usr/hdp/current/kafka-broker/bin/kafka-console-producer.sh --broker-list kafka-node1.domain:6668,kafka-node2.domain:6668 --topic test --new-producer --producer-property "security.protocol=SASL_SSL"

#consumer ssl
/usr/hdp/current/kafka-broker/bin/kafka-console-consumer.sh --new-consumer --bootstrap-server kafka-node1.domain:6668,kafka-node2.domain:6668 --topic test --from-beginning --security-protocol SASL_SSL

Tested with HDP 2.4.2 + Ambari 2.2

Comments
Expert Contributor

Thank you for the nice cheat sheet.

I configured my secure HDP 2.4.2 + Ambari 2.2 cluster according to the cheat sheet above. I could send messages using the console producer.

<Broker_home>/bin/kafka-console-producer.sh --broker-list <KAFKA_BROKER>:6667 --topic test  --security-protocol SASL_PLAINTEXT

When I try to consume the messages (on the same machine) I get an error.

I start the consumer like this:

/usr/hdp/current/kafka-broker/bin/kafka-console-consumer.sh  --zookeeper ZK1:2181,ZK-2:2181,ZK-3:2181 --topic test --from-beginning --security-protocol SASL_PLAINTEXT

The error stack is:

[2016-09-08 13:59:26,167] WARN [console-consumer-39119_HOST_NAME-1473343165849-9f1b8f0d-leader-finder-thread], Failed to find leader for Set([test,0], [test,1]) (kafka.consumer.ConsumerFetcherManager$LeaderFinderThread)
kafka.common.BrokerEndPointNotAvailableException: End point PLAINTEXT not found for broker 0
        at kafka.cluster.Broker.getBrokerEndPoint(Broker.scala:141)
        at kafka.utils.ZkUtils$$anonfun$getAllBrokerEndPointsForChannel$1.apply(ZkUtils.scala:180)
        at kafka.utils.ZkUtils$$anonfun$getAllBrokerEndPointsForChannel$1.apply(ZkUtils.scala:180)

What do you think has gone wrong?

Regards,

SS

PS: Do we have another wiki page for best practices around Kafka? @Sriharsha Chintalapani, @Andrew Grande, @Vadim Vaks, @Predrag Minovic, @

Contributor

I just tried this with HDP 2.5.3 + Ambari 2.4.2. For the SASL_SSL example, I had to make a couple of changes to get it to work.

For the producer, when I tried the example above, I got "new-producer is not a recognized option".

I had to create an SSL properties file (I called mine /tmp/kakfka.ssl.properties) and then run the producer as:

/usr/hdp/current/kafka-broker/bin/kafka-console-producer.sh --broker-list kafka-node1.domain:6668,kafka-node2.domain:6668 --topic test --producer.config /tmp/kakfka.ssl.properties --security-protocol SASL_SSL
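
For reference, a minimal sketch of what such a properties file might contain; the truststore path and password are placeholders, and security.protocol simply mirrors the flag already passed on the command line:

security.protocol=SASL_SSL
ssl.truststore.location=/path-to-your-truststore
ssl.truststore.password=YOUR-PASSWORD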

Similarly, for the consumer I had to add a reference to the SSL properties file. I used this:

/usr/hdp/current/kafka-broker/bin/kafka-console-consumer.sh --consumer.config /tmp/kakfka.ssl.properties --bootstrap-server kafka-node1.domain:6668,kafka-node2.domain:6668 --topic test --from-beginning --security-protocol SASL_SSL --new-consumer