
Console producer/consumer not working in kafka 0.10.2 with kerberos

Expert Contributor

Hi,

We have recently started using Kafka 0.10.2 but are unable to produce or consume any messages. It has Kerberos enabled. Below are my configs. There is no error, and the Kafka data log also has no entries, but the index gets updated whenever we run a producer.

kafka-console-producer --broker-list kafka1.test.com:9092,kafka2.test.com:9092 --producer.config client.properties --topic TEST

kafka-console-consumer --topic TEST --from-beginning --bootstrap-server kafka1.test.com:9092,kafka2.test.com:9092 --consumer.config consumer.properties

jaas.conf:

KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useTicketCache=true;
};

client.properties/consumer.properties:

security.protocol=SASL_PLAINTEXT

sasl.kerberos.service.name=kafka
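
For reference, the console tools only pick up the JAAS file if it is passed in through KAFKA_OPTS, along these lines (the jaas.conf path is a placeholder; with useTicketCache=true you also need a valid ticket from kinit first):

export KAFKA_OPTS="-Djava.security.auth.login.config=/path/to/jaas.conf"
kafka-console-producer --broker-list kafka1.test.com:9092,kafka2.test.com:9092 --producer.config client.properties --topic TEST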


17/11/22 12:43:01 ERROR internals.ErrorLoggingCallback: Error when sending message to topic TEST with key: null, value: 4 bytes with error: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms.

17/11/22 12:44:01 ERROR internals.ErrorLoggingCallback: Error when sending message to topic TEST with key: null, value: 2 bytes with error: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms.

17/11/22 12:45:01 ERROR internals.ErrorLoggingCallback: Error when sending message to topic TEST with key: null, value: 5 bytes with error: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms.

17/11/22 12:46:01 ERROR internals.ErrorLoggingCallback: Error when sending message to topic TEST with key: null, value: 4 bytes with error: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms.

1 ACCEPTED SOLUTION

Try changing your producer JAAS config:

KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useTicketCache=false
  useKeyTab=true
  storeKey=true
  keyTab="path to the file.keytab"
  principal="kafka/ip-10-197-17-69.eu-west-1.compute.internal@DOMAIN";
};

And try to change the consumer config:

security.protocol=SASL_PLAINTEXT
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
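
When switching to a keytab as above, it may help to first confirm that the keytab and principal actually work, for example (keytab path and principal are the placeholders from the snippet above):

# obtain a ticket from the keytab, then list it to verify
kinit -kt "path to the file.keytab" kafka/ip-10-197-17-69.eu-west-1.compute.internal@DOMAIN
klist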


11 REPLIES

Expert Contributor

Tried using the kafka keytab. Below is the topic created with the kafka service keytab, but it's the same issue.

Topic:Hello1 PartitionCount:2 ReplicationFactor:2 Configs:min.insync.replicas=2

Topic: Hello1 Partition: 0 Leader: 35 Replicas: 35,38 Isr: 35,38

Topic: Hello1 Partition: 1 Leader: 38 Replicas: 38,33 Isr: 38,33

Producer:

kafka-verifiable-producer.sh --topic Hello1 --broker-list server1.kafka2.pre.corp:9092,server2.kafka2.pre.corp:9092 --producer.config client.properties

17/11/23 09:58:07 INFO utils.AppInfoParser: Kafka version : 0.10.2-kafka-2.2.0
17/11/23 09:58:07 INFO utils.AppInfoParser: Kafka commitId : unknown

1 2 3 4 45

17/11/23 09:58:25 INFO producer.KafkaProducer: Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
{"name":"shutdown_complete"}
{"sent":1,"name":"tool_data","avg_throughput":0.0,"target_throughput":-1,"acked":0}

Consumer:

kafka-console-consumer --topic Hello1 --from-beginning --zookeeper server.kafka2.pre.corp:2181,server.kafka2.pre.corp:2181/kafka2dc1pre --consumer.config consumer.properties

17/11/23 09:59:04 INFO utils.ZKCheckedEphemeral: Creating /consumers/console-consumer-59653/ids/console-consumer-59653_server1.kafka2.pre.corp-1511431144657-52e4b1e7 (is it secure? false)

17/11/23 09:59:04 INFO utils.ZKCheckedEphemeral: Result of znode creation is: OK

...

17/11/23 09:59:05 INFO consumer.ZookeeperConsumerConnector: [console-consumer-59653_server1.kafka2.pre.corp-1511431144657-52e4b1e7], end rebalancing consumer console-consumer-59653_server1.kafka2.pre.corp-1511431144657-52e4b1e7 try #0

17/11/23 09:59:05 INFO consumer.ZookeeperConsumerConnector: [console-consumer-59653_server1.kafka2.pre.corp-1511431144657-52e4b1e7], Creating topic event watcher for topics Hello1

17/11/23 09:59:05 INFO consumer.ZookeeperConsumerConnector: [console-consumer-59653_server1.kafka2.pre.corp-1511431144657-52e4b1e7], Topics to consume = ArrayBuffer(Hello1)

17/11/23 09:59:05 WARN consumer.ConsumerFetcherManager$LeaderFinderThread: [console-consumer-59653_server1.kafka2.pre.corp-1511431144657-52e4b1e7-leader-finder-thread], Failed to find leader for Set(Hello1-0, Hello1-1)
kafka.common.BrokerEndPointNotAvailableException: End point with security protocol PLAINTEXT not found for broker 33
	at kafka.client.ClientUtils$anonfun$getPlaintextBrokerEndPoints$1$anonfun$apply$5.apply(ClientUtils.scala:146)
	at kafka.client.ClientUtils$anonfun$getPlaintextBrokerEndPoints$1$anonfun$apply$5.apply(ClientUtils.scala:146)
	at scala.Option.getOrElse(Option.scala:121)
	at kafka.client.ClientUtils$anonfun$getPlaintextBrokerEndPoints$1.apply(ClientUtils.scala:146)
	at kafka.client.ClientUtils$anonfun$getPlaintextBrokerEndPoints$1.apply(ClientUtils.scala:142)
	at scala.collection.TraversableLike$anonfun$map$1.apply(TraversableLike.scala:234)
	at scala.collection.TraversableLike$anonfun$map$1.apply(TraversableLike.scala:234)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
	at scala.collection.AbstractTraversable.map(Traversable.scala:104)
	at kafka.client.ClientUtils$.getPlaintextBrokerEndPoints(ClientUtils.scala:142)
	at kafka.consumer.ConsumerFetcherManager$LeaderFinderThread.doWork(ConsumerFetcherManager.scala:67)
	at kafka.utils.ShutdownableThread.run(ShutdownableThread.scala:63)

17/11/23 09:59:05 INFO consumer.ConsumerFetcherManager: [ConsumerFetcherManager-1511431144680] Added fetcher for partitions ArrayBuffer()

17/11/23 09:59:05 WARN consumer.ConsumerFetcherManager$LeaderFinderThread: [console-consumer-59653_server1.kafka2.pre.corp-1511431144657-52e4b1e7-leader-finder-thread], Failed to find leader for Set(Hello1-0, Hello1-1)
kafka.common.BrokerEndPointNotAvailableException: End point with security protocol PLAINTEXT not found for broker 33
	at kafka.client.ClientUtils$anonfun$getPlaintextBrokerEndPoints$1$anonfun$apply$5.apply(ClientUtils.scala:146)
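
Side note: this BrokerEndPointNotAvailableException comes from the old ZooKeeper-based consumer (selected by --zookeeper), which can only talk to a PLAINTEXT broker endpoint; on a SASL-only cluster there is none to find. The new consumer honors security.protocol from the config file and is selected with --bootstrap-server, roughly:

kafka-console-consumer --topic Hello1 --from-beginning --bootstrap-server server1.kafka2.pre.corp:9092,server2.kafka2.pre.corp:9092 --consumer.config consumer.properties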

Try changing your producer JAAS config:

KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useTicketCache=false
  useKeyTab=true
  storeKey=true
  keyTab="path to the file.keytab"
  principal="kafka/ip-10-197-17-69.eu-west-1.compute.internal@DOMAIN";
};

And try to change the consumer config:

security.protocol=SASL_PLAINTEXT
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka

Expert Contributor

@Tomas79

Thanks, this worked.


Hi Rajesh,

I am also facing the same issue while trying to send messages with kafka-console-producer. I am using 2 brokers in my cluster. I created the topic by logging into one of the broker host machines. When I run the command below, it does not reach the message prompt; when I type something and press enter, it gives an error.

command: /usr/bin/kafka-console-producer --broker-list hostname1:9092,hostname2:9092 --topic testtopic1

Result:

18/08/22 22:51:32 INFO producer.ProducerConfig: ProducerConfig values:
compression.type = none
metric.reporters = []
metadata.max.age.ms = 300000
metadata.fetch.timeout.ms = 60000
reconnect.backoff.ms = 50
sasl.kerberos.ticket.renew.window.factor = 0.8
bootstrap.servers = [b2brp-cdh-cmsn0.hostanameXXXXXXXX:9092, b2brp-cdh-cmsn1.hostanameXXXXXXXX:9092]
retry.backoff.ms = 100
sasl.kerberos.kinit.cmd = /usr/bin/kinit
buffer.memory = 33554432
timeout.ms = 30000
key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
ssl.keystore.type = JKS
ssl.trustmanager.algorithm = PKIX
block.on.buffer.full = false
ssl.key.password = null
max.block.ms = 60000
sasl.kerberos.min.time.before.relogin = 60000
connections.max.idle.ms = 540000
ssl.truststore.password = null
max.in.flight.requests.per.connection = 5
metrics.num.samples = 2
client.id = console-producer
ssl.endpoint.identification.algorithm = null
ssl.protocol = TLS
request.timeout.ms = 1500
ssl.provider = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
acks = 0
batch.size = 16384
ssl.keystore.location = null
receive.buffer.bytes = 32768
ssl.cipher.suites = null
ssl.truststore.type = JKS
security.protocol = PLAINTEXT
retries = 3
max.request.size = 1048576
value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
ssl.truststore.location = null
ssl.keystore.password = null
ssl.keymanager.algorithm = SunX509
metrics.sample.window.ms = 30000
partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
send.buffer.bytes = 102400
linger.ms = 1000

18/08/22 22:51:32 INFO utils.AppInfoParser: Kafka version : 0.9.0-kafka-2.0.2
18/08/22 22:51:32 INFO utils.AppInfoParser: Kafka commitId : unknown
hi
18/08/22 22:51:40 WARN clients.NetworkClient: Bootstrap broker b2brp-cdh-cmsn1.hostanameXXXXXXXX:9092 disconnected
18/08/22 22:51:41 WARN clients.NetworkClient: Bootstrap broker b2brp-cdh-cmsn0.hostanameXXXXXXXX:9092 disconnected
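
Note the security.protocol = PLAINTEXT in the dump above: the command was run without --producer.config, so the client attempts an unsecured connection and the Kerberized brokers drop it, which surfaces as the "Bootstrap broker ... disconnected" warnings. A minimal client config, assuming SASL without TLS, would be something like:

security.protocol=SASL_PLAINTEXT
sasl.kerberos.service.name=kafka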

Can I know in which path I can find the two config files (consumer.properties & producer.properties)?

Thanks in advance.

Explorer

I got exactly the same problem with the same question.

...

18/08/29 15:38:00 INFO utils.AppInfoParser: Kafka version : 1.0.1-kafka-3.1.0-SNAPSHOT
18/08/29 15:38:00 INFO utils.AppInfoParser: Kafka commitId : unknown
>hello
18/08/29 15:38:13 WARN clients.NetworkClient: [Producer clientId=console-producer] Bootstrap broker ourhost:9092 (id: -1 rack: null) disconnected

...

My Kafka parcel version is: 3.1.0-1.3.1.0.p0.35


You'll have to create the client.properties file yourself, as noted in "Step 5. Configuring Kafka Clients" here:
https://www.cloudera.com/documentation/kafka/latest/topics/kafka_security.html

cat >/root/client.properties<<EOF
security.protocol=SASL_SSL
sasl.kerberos.service.name=kafka
ssl.client.auth=none
ssl.truststore.location=/etc/cdep-ssl-conf/CA_STANDARD/truststore.jks
ssl.truststore.password=cloudera
EOF

cat >/root/jaas.conf<<EOF
KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
storeKey=false
useTicketCache=true
keyTab="/cdep/keytabs/kafka.keytab"
principal="kafka@EXAMPLE.CLOUDERA.COM";
};
EOF

KAFKA_OPTS="-Djava.security.auth.login.config=/root/jaas.conf" kafka-console-producer --broker-list ${HOSTNAME}:9093 --topic test --producer.config /root/client.properties

Explorer

pdvorak: Thank you, it led me to running the producer and consumer without errors. I just modified the configuration to use the unsecured 9092 port.

  jaas.conf:
KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
storeKey=false
useTicketCache=true
keyTab="somePathToKeytab"
principal="somePrincipal";
};
  client.properties:
security.protocol=SASL_PLAINTEXT
sasl.kerberos.service.name=kafka
sasl.mechanism=GSSAPI
ssl.client.auth=none

I can start producer and consumer with the following commands now:

  producer:
KAFKA_OPTS="-Djava.security.auth.login.config=/root/jaas.conf" kafka-console-producer --broker-list ourHost:9092 --topic test --producer.config /root/client.properties
  consumer:
KAFKA_OPTS="-Djava.security.auth.login.config=/root/jaas.conf" kafka-console-consumer --bootstrap-server ourHost:9092 --topic test --consumer.config /root/client.properties

Although it does not throw any errors, the consumer does not print any messages. Do you have any idea where the problem might be now?

I can see the number of sent messages in the Cloudera Manager chart "Total Messages Received Across Kafka Brokers", so I assume they were sent properly by the producer.
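
One thing worth ruling out first: without --from-beginning, the console consumer only shows messages produced after it starts, so messages from an earlier producer run will appear as silence. For example:

KAFKA_OPTS="-Djava.security.auth.login.config=/root/jaas.conf" kafka-console-consumer --bootstrap-server ourHost:9092 --topic test --from-beginning --consumer.config /root/client.properties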

Expert Contributor

Hi,

Try only the settings below in your consumer properties file.

security.protocol=SASL_PLAINTEXT
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
group.id=testgroup
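
If the consumer still prints nothing, it may also help to check whether the group gets partitions assigned and where its offsets sit, e.g. with the consumer-groups tool (the group name matches the group.id above; consumer.properties is the same file):

kafka-consumer-groups --bootstrap-server ourHost:9092 --describe --group testgroup --command-config consumer.properties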

Explorer

Hi,

I tried it and it did not help; the consumer still prints nothing. Could it be blocked by Sentry? We have not configured topic permissions yet; however, the producer can write there.

I also tried to consume from a non-existent topic, and in that case it properly shows an error fetching metadata.
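
For the Sentry angle: when Sentry authorization is enabled for Kafka, a consumer needs privileges on both the topic and its consumer group, and a missing consumer-group privilege tends to fail silently. A sketch with Cloudera's kafka-sentry tool, assuming a role named some_role (role, topic, and group names are placeholders; verify the exact syntax against the Cloudera Sentry documentation):

# list privileges currently granted to the role
kafka-sentry -lp -r some_role
# grant read on the topic and on the consumer group
kafka-sentry -gpr -r some_role -p "Host=*->Topic=test->action=read"
kafka-sentry -gpr -r some_role -p "Host=*->Consumergroup=testgroup->action=read"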