
Console producer/consumer not working in kafka 0.10.2 with kerberos

Expert Contributor

Hi,

We have recently started using Kafka 0.10.2 but are unable to produce any messages or consume them. Kerberos is enabled. Below are my configs. There is no error, and the Kafka data log doesn't get any entries either, but the index gets updated whenever we run a producer.

kafka-console-producer --broker-list kafka1.test.com:9092,kafka2.test.com:9092 --producer.config client.properties --topic TEST

kafka-console-consumer --topic TEST --from-beginning --bootstrap-server kafka1.test.com:9092,kafka2.test.com:9092 --consumer.config consumer.properties

jaas.conf:

KafkaClient { com.sun.security.auth.module.Krb5LoginModule required useTicketCache=true; };

client.properties/consumer.properties:

security.protocol=SASL_PLAINTEXT

sasl.kerberos.service.name=kafka
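
The JAAS file is passed to the console tools through the KAFKA_OPTS environment variable before running the commands above; a minimal sketch, assuming the file is saved as /home/userid/jaas.conf:

export KAFKA_OPTS="-Djava.security.auth.login.config=/home/userid/jaas.conf"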

17/11/22 12:43:01 ERROR internals.ErrorLoggingCallback: Error when sending message to topic TEST with key: null, value: 4 bytes with error: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms.

17/11/22 12:44:01 ERROR internals.ErrorLoggingCallback: Error when sending message to topic TEST with key: null, value: 2 bytes with error: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms.

17/11/22 12:45:01 ERROR internals.ErrorLoggingCallback: Error when sending message to topic TEST with key: null, value: 5 bytes with error: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms.

17/11/22 12:46:01 ERROR internals.ErrorLoggingCallback: Error when sending message to topic TEST with key: null, value: 4 bytes with error: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms.

1 ACCEPTED SOLUTION

Try to change your producer config:

KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
useTicketCache=false
useKeyTab=true
storeKey=true
keyTab="path to the file.keytab"
principal="kafka/ip-10-197-17-69.eu-west-1.compute.internal@DOMAIN";
};

And try to change the consumer config:

security.protocol=SASL_PLAINTEXT
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
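
For completeness, a minimal sketch of how the two pieces above fit together, assuming the JAAS stanza is saved as /home/userid/jaas.conf and the properties as /home/userid/client.properties:

export KAFKA_OPTS="-Djava.security.auth.login.config=/home/userid/jaas.conf"
kafka-console-producer --broker-list kafka1.test.com:9092,kafka2.test.com:9092 --producer.config /home/userid/client.properties --topic TEST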


11 REPLIES


Hi,

 

Follow the steps below.

1) Change the Inter Broker Protocol property to SASL_PLAINTEXT in the Cloudera Manager Kafka configuration, and restart the Kafka service.
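
If you manage the broker configuration by hand instead of through Cloudera Manager, the equivalent server.properties settings look roughly like this (the listener address is a placeholder):

listeners=SASL_PLAINTEXT://0.0.0.0:9092
security.inter.broker.protocol=SASL_PLAINTEXT
sasl.kerberos.service.name=kafka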

 

2) Create a jaas.conf file in your home path /home/userid/. Create a new keytab using the principal of the same user id and put it at the keyTab path below, and replace the principal with your own:
vi jaas.conf
KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
useTicketCache=false
useKeyTab=true
serviceName="kafka"
storeKey=true
keyTab="/home/userid/useridkerberos.keytab"
principal="userid@REALM.COM"
client=true;};
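
To confirm that the keytab actually contains the principal referenced in jaas.conf, you can list its entries (using the example path from above):

klist -kt /home/userid/useridkerberos.keytab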

 

3) Create a client.properties file with the following properties in the same path /home/userid/:
sudo vi client.properties
security.protocol=SASL_PLAINTEXT
sasl.kerberos.service.name=kafka

 

4) Log in as the same user id and do kinit:

kinit -kt useridkerberos.keytab userid@REALM.COM
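
You can check that the ticket was obtained by running klist; it should show a ticket for the principal used above:

klist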

 

5) Creating Topics:
----------------
/usr/bin/kafka-topics --create --zookeeper hostname1:2181,hostname2:2181,hostname3:2181/kafka --replication-factor 2 --partitions 2 --topic newtopic1
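
To verify that the topic was created, you can list the topics on the same ZooKeeper chroot (same hostnames as in the create command):

/usr/bin/kafka-topics --list --zookeeper hostname1:2181,hostname2:2181,hostname3:2181/kafka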

 

6) Describing Topics:
------------------

/usr/bin/kafka-topics --describe --zookeeper hostname1:2181,hostname2:2181,hostname3:2181/kafka --topic newtopic1

 

7) export KAFKA_OPTS="-Djava.security.auth.login.config=/home/userid/jaas.conf"
Verify it: echo "$KAFKA_OPTS"

 

8) Writing messages using Producer:
--------------------------------
/usr/bin/kafka-console-producer --broker-list brokerhostname1:9092,brokerhostname2:9092 --topic newtopic6 --producer.config client.properties
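
Each line typed at the producer prompt is sent as one message; input can also be piped in, for example:

echo "test message" | /usr/bin/kafka-console-producer --broker-list brokerhostname1:9092,brokerhostname2:9092 --topic newtopic6 --producer.config client.properties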

 

9) Open a second session on the same machine and run the consumer command.

Reading messages using Consumer:
-------------------------------

export KAFKA_OPTS="-Djava.security.auth.login.config=/home/userid/jaas.conf"


/usr/bin/kafka-console-consumer --new-consumer --topic newtopic6 --from-beginning --bootstrap-server brokerhostname1:9092,brokerhostname2:9092 --consumer.config client.properties

Try it. It may work at your end as well!

Explorer

poojary_sudth:

Thank you, but that is more or less the same set of steps I did before.

I found how to make the consumer work: it was necessary to add the parameter --partition 0:

KAFKA_OPTS="-Djava.security.auth.login.config=/root/jaas.conf" kafka-console-consumer --bootstrap-server ourHost:9092 --topic test --consumer.config /root/client.properties --partition 0

I cannot see all the messages coming into the topic, but at least the ones that fall into the specified partition are printed, which is enough for me to confirm that the Kafka broker works.
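
If the topic has more than one partition, the remaining messages can be read the same way by pointing a consumer at each of the other partitions, for example partition 1:

KAFKA_OPTS="-Djava.security.auth.login.config=/root/jaas.conf" kafka-console-consumer --bootstrap-server ourHost:9092 --topic test --consumer.config /root/client.properties --partition 1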

I found this hint here:  https://stackoverflow.com/questions/34844209/consumer-not-receiving-messages-kafka-console-new-consu...