Steps:

  • When configuring a Kafka broker to use only SSL, you can get both encryption and client authentication by enabling two-way SSL (mutual TLS) with the parameter ssl.client.auth=required.
  • Go to Ambari > Kafka > Configs > Custom kafka-broker and add the ssl.client.auth=required parameter.
  • In this setup the certificate owner name (the subject DN) acts as the username, and authentication happens against that name. You have to add the server certificate owner name from each broker to super.users (usernames separated by semicolons) so that each broker can access resources on the other brokers.
Example: with two brokers you need to add both server certificate owner names to super.users.
Go to Ambari > Kafka > Configs > Custom kafka-broker and configure the super.users parameter:
super.users = User:CN=user1,OU=kafka,O=kafka,L=kafka,ST=kafka,C=xx;User:CN=user2,OU=kafka,O=kafka,L=kafka,ST=kafka,C=xx
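The super.users value above follows a simple pattern: one User:&lt;DN&gt; entry per broker, joined with semicolons. A minimal standalone sketch (the DNs are the illustrative placeholders from the example, not real certificates):

```python
# Build a super.users value from each broker's certificate owner name
# (subject DN). These DNs are illustrative placeholders.
broker_dns = [
    "CN=user1,OU=kafka,O=kafka,L=kafka,ST=kafka,C=xx",
    "CN=user2,OU=kafka,O=kafka,L=kafka,ST=kafka,C=xx",
]

# Kafka expects one "User:<DN>" principal per broker, separated by semicolons.
super_users = ";".join(f"User:{dn}" for dn in broker_dns)
print(f"super.users={super_users}")
```

Adding a third broker is just one more DN in the list; the separator stays a semicolon because the DNs themselves contain commas.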
  • You can create different client certificates for different users. For example, if the client certificate owner name is "CN=kafka,OU=kafka,O=kafka,L=kafka,ST=kafka,C=xx", you can set topic ACLs that allow this user to produce and consume data.
  • To produce data, execute the commands below:
    ./kafka-acls.sh --authorizer kafka.security.auth.SimpleAclAuthorizer --authorizer-properties zookeeper.connect=<zkhost>:<port> --add  --allow-principal User:"CN=kafka,OU=kafka,O=kafka,L=kafka,ST=kafka,C=xx"  --cluster --producer --topic test
    ./kafka-console-producer.sh --broker-list <broker>:<port> --topic test --producer.config <client-ssl-path>/client-ssl.properties  --security-protocol SSL
  • client-ssl.properties:
    security.protocol=SSL
    ssl.truststore.location=<client.truststore.jks>
    ssl.truststore.password=<truststore-password>
    ssl.keystore.location=<client.keystore.jks>
    ssl.keystore.password=<keystore-password>
    ssl.key.password=<key-password>
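Before pointing the console clients at client-ssl.properties, it can help to sanity-check that the file defines every key the SSL channel needs. A minimal sketch with no Kafka dependency (the sample paths and passwords are assumptions for illustration):

```python
# Parse a Java-style properties file and verify that the SSL keys the
# console clients need are all present and non-empty.
REQUIRED_KEYS = {
    "security.protocol",
    "ssl.truststore.location",
    "ssl.truststore.password",
    "ssl.keystore.location",
    "ssl.keystore.password",
    "ssl.key.password",
}

def load_properties(text):
    """Parse key=value lines, skipping blanks and # comments."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

def missing_ssl_keys(props):
    """Return the required SSL keys that are absent or empty."""
    return sorted(k for k in REQUIRED_KEYS if not props.get(k))

# Sample contents mirroring client-ssl.properties (values are placeholders).
sample = """\
security.protocol=SSL
ssl.truststore.location=/etc/kafka/client.truststore.jks
ssl.truststore.password=changeit
ssl.keystore.location=/etc/kafka/client.keystore.jks
ssl.keystore.password=changeit
ssl.key.password=changeit
"""
print(missing_ssl_keys(load_properties(sample)))  # → []
```

An empty result means the file at least names every required setting; it does not prove the keystore contents or passwords are valid.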
  • To consume data, execute the commands below:
    ./kafka-acls.sh --authorizer kafka.security.auth.SimpleAclAuthorizer --authorizer-properties zookeeper.connect=<zkhost>:<port> --add  --allow-principal User:"CN=kafka,OU=kafka,O=kafka,L=kafka,ST=kafka,C=xx"  --group=* --consumer --topic test
    ./kafka-console-consumer.sh --bootstrap-server <broker>:<port> --topic test --security-protocol SSL --consumer.config <client-ssl-path>/client-ssl.properties --from-beginning
  • If the certificate subject contains any attribute other than CN, L, ST, O, OU, C, or STREET, it is emitted as an Object Identifier (OID), which is difficult to read. In that case you can implement a custom principal builder class (implementing the PrincipalBuilder interface) to extract a simple username from the certificate owner name.
    • Build the custom JAR and copy it to the <kafka-broker>/libs/ folder on all brokers.
    • Configure the parameter below in Ambari > Kafka > Configs > Custom kafka-broker to load the custom class:
    • principal.builder.class=kafka.security.auth.CustomPrincipalBuilder
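The actual principal builder must be written in Java against Kafka's PrincipalBuilder interface, but the extraction it performs can be sketched in standalone Python: take the subject DN the SSL session reports and keep only the CN value as the principal name (the DN and function name below are illustrative assumptions):

```python
# Sketch of the username extraction a custom principal builder performs:
# given a certificate subject DN, return just the CN attribute so that
# ACLs and super.users can use a short name instead of the full DN.
def extract_cn(subject_dn):
    """Return the CN value from a comma-separated subject DN,
    or the full DN unchanged if no CN attribute is present."""
    for part in subject_dn.split(","):
        key, _, value = part.strip().partition("=")
        if key.upper() == "CN":
            return value
    return subject_dn

print(extract_cn("CN=kafka,OU=kafka,O=kafka,L=kafka,ST=kafka,C=xx"))  # → kafka
```

Note that DN attribute values can contain escaped commas, which this naive split ignores; a real Java implementation should parse the DN with a proper X.500 name parser such as javax.naming.ldap.LdapName.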