Created on 01-26-2017 10:14 AM
I found it a little tricky to get started with a Kerberos-enabled Kafka cluster, so I put together this step-by-step recipe for securing Kafka with Kerberos and sending and receiving data on the console. It was tested on HDP 2.5.0 and Ambari 2.4.1.
$ cd /usr/hdp/current/kafka-broker/bin
$ sudo su kafka
$ kinit -k -t /etc/security/keytabs/kafka.service.keytab kafka/ip-10-0-1-130.ap-northeast-1.compute.internal
$ ./kafka-topics.sh --zookeeper ip-10-0-1-130.ap-northeast-1.compute.internal:2181 --create --topic foo --partitions 1 --replication-factor 1
Created topic "foo".
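The same ZooKeeper and broker endpoints recur in every command in this recipe. A small sketch (the hostname is copied from the commands in this post; the variable names are just illustrative) keeps them in shell variables to cut down on typos:

```shell
# Illustrative only: hold the repeated endpoints in variables.
KAFKA_HOST="ip-10-0-1-130.ap-northeast-1.compute.internal"
ZK_CONNECT="${KAFKA_HOST}:2181"    # ZooKeeper, used by kafka-topics.sh and kafka-acls.sh
BROKER_LIST="${KAFKA_HOST}:6667"   # broker endpoint, used by the console producer
echo "$ZK_CONNECT"
echo "$BROKER_LIST"
```

The later commands can then use `$ZK_CONNECT` and `$BROKER_LIST` instead of the full FQDN.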
# Grant user bob producer rights on topic foo
$ ./kafka-acls.sh --authorizer-properties zookeeper.connect=ip-10-0-1-130.ap-northeast-1.compute.internal:2181 \
    --add --allow-principal User:bob \
    --producer --topic foo
Adding ACLs for resource `Topic:foo`:
 	User:bob has Allow permission for operations: Describe from hosts: *
 	User:bob has Allow permission for operations: Write from hosts: *
Adding ACLs for resource `Cluster:kafka-cluster`:
 	User:bob has Allow permission for operations: Create from hosts: *
Current ACLs for resource `Topic:foo`:
 	User:bob has Allow permission for operations: Describe from hosts: *
 	User:bob has Allow permission for operations: Write from hosts: *

# Grant user bob consumer rights. Quote the * so the shell does not glob it
# against the script names in the current directory (unquoted, it expands to
# a filename such as connect-distributed.sh and that becomes the group name).
$ ./kafka-acls.sh --authorizer-properties zookeeper.connect=ip-10-0-1-130.ap-northeast-1.compute.internal:2181 \
    --add --allow-principal User:bob \
    --consumer --topic foo --group "*"
Adding ACLs for resource `Topic:foo`:
 	User:bob has Allow permission for operations: Read from hosts: *
 	User:bob has Allow permission for operations: Describe from hosts: *
Adding ACLs for resource `Group:*`:
 	User:bob has Allow permission for operations: Read from hosts: *
Current ACLs for resource `Topic:foo`:
 	User:bob has Allow permission for operations: Read from hosts: *
 	User:bob has Allow permission for operations: Describe from hosts: *
 	User:bob has Allow permission for operations: Write from hosts: *
Current ACLs for resource `Group:*`:
 	User:bob has Allow permission for operations: Read from hosts: *
# Switch to user bob and log in to the KDC.
$ kinit bob

# Start the console producer
$ ./kafka-console-producer.sh --broker-list ip-10-0-1-130.ap-northeast-1.compute.internal:6667 --topic foo --security-protocol PLAINTEXTSASL

# In another terminal, start the console consumer
$ ./kafka-console-consumer.sh --zookeeper ip-10-0-1-130.ap-northeast-1.compute.internal:2181 --topic foo --security-protocol PLAINTEXTSASL
{metadata.broker.list=ip-10-0-1-130.ap-northeast-1.compute.internal:6667, request.timeout.ms=30000, client.id=console-consumer-57797, security.protocol=PLAINTEXTSASL}

# Type something in the producer terminal; it should appear in the consumer terminal immediately.
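For the SASL handshake to work after `kinit`, the console clients need a JAAS section that points at the Kerberos ticket cache. On HDP this is normally shipped for you as `kafka_client_jaas.conf` under the Kafka config directory, so no manual step is needed; for reference, a minimal sketch of such a client config (written to /tmp here purely for illustration):

```shell
# Sketch of a client JAAS config that uses the ticket cache from kinit.
# The /tmp path is illustrative; on HDP the real file is managed by Ambari.
cat > /tmp/kafka_client_jaas.conf <<'EOF'
KafkaClient {
   com.sun.security.auth.module.Krb5LoginModule required
   useTicketCache=true
   renewTicket=true
   serviceName="kafka";
};
EOF
cat /tmp/kafka_client_jaas.conf
```

`useTicketCache=true` is what lets the client pick up the credentials obtained by `kinit bob` rather than requiring its own keytab.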
Created on 03-07-2017 05:05 PM
@yjiang Creating topics with the kafka user is good practice, but if you want to create a topic as a non-kafka user in a Kerberized environment, you need to work around it with the steps below:
If you are not using Ranger:
1. Make sure "auto.create.topics.enable = true" is set on the brokers (note the plural "topics").
2. Grant ACLs to the user that should create the topic, for example:
# bin/kafka-acls.sh --authorizer kafka.security.auth.SimpleAclAuthorizer --authorizer-properties zookeeper.connect=localhost:2181 --add --allow-principal User:Bob --producer --topic Test-topic
3. Do a kinit as the user that will create the topic.
4. Now try to produce messages to the topic as that user:
# ./kafka-console-producer.sh --broker-list <hostname-broker>:6667 --topic Test-topic --security-protocol PLAINTEXTSASL
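An aside on the protocol name (assumption: this applies when you move to newer, non-HDP client versions): `PLAINTEXTSASL` is the name the HDP console scripts accept, while upstream Kafka clients call the same thing `SASL_PLAINTEXT` and take it from a client properties file rather than a flag. A sketch of the equivalent client config:

```properties
# client.properties for newer Kafka clients (SASL_PLAINTEXT is the
# upstream name for what the HDP scripts call PLAINTEXTSASL)
security.protocol=SASL_PLAINTEXT
sasl.kerberos.service.name=kafka
```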
If you are using Ranger:
Instead of step 2 above, add a policy for the topic in Ranger that allows the user to produce, create, and consume. Restart the Kafka service, then follow steps 3 and 4 as above.