<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Timeout Error When Using kafka-console-consumer and kafka-console-producer On Secured Cluster in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58064#M45032</link>
    <description>Try turning on DEBUG for the client commands. If you have a kafka gateway installed on that node, you can set the DEBUG level in the kafka service; otherwise, modify /etc/kafka/conf/tools-log4j.properties to set the log level to DEBUG, and then run your producer or consumer.&lt;BR /&gt;&lt;BR /&gt;Some "retryable" errors when security is enabled can keep the clients from properly connecting.&lt;BR /&gt;&lt;BR /&gt;-pd</description>
    <pubDate>Wed, 26 Jul 2017 19:05:41 GMT</pubDate>
    <dc:creator>pdvorak</dc:creator>
    <dc:date>2017-07-26T19:05:41Z</dc:date>
    <item>
      <title>Timeout Error When Using kafka-console-consumer and kafka-console-producer On Secured Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58061#M45031</link>
      <description>&lt;P&gt;I recently installed Kafka onto an already secured cluster. I've configured Kafka to use Kerberos and SSL, and set the protocol to SASL_SSL, roughly following the documentation here (I used certificates already created):&amp;nbsp;&lt;A href="https://www.cloudera.com/documentation/kafka/latest/topics/kafka_security.html" target="_blank"&gt;https://www.cloudera.com/documentation/kafka/latest/topics/kafka_security.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;When I bring up kafka-console-consumer, a few minor log messages come up, and then it sits waiting for messages correctly. When I bring up kafka-console-producer, the same happens. I am pointing both to the same node, which is both a Kafka broker and a Zookeeper node, with port&amp;nbsp;9092 for the producer and port 2181 for the consumer. If I type something into the console for the producer, however, nothing will happen for a while, and then I will get the following error:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;17/07/26 13:11:20 ERROR internals.ErrorLoggingCallback: Error when sending message to topic test with key: null, value: 5 bytes with error:
org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms.&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The Kafka logs in that timeframe don't seem to have any errors or warnings. The Zookeeper logs are also clean except for one warning that shows up only in the log of the zookeeper node I am pointing the consumer to:&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;2017-07-26 13:10:17,379 WARN org.apache.zookeeper.server.NIOServerCnxn: Exception causing close of session 0x0 due to java.io.EOFException&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Any ideas on what would cause this behavior or how to further debug what the issue is?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 16 Sep 2022 11:59:20 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58061#M45031</guid>
      <dc:creator>mcginnda</dc:creator>
      <dc:date>2022-09-16T11:59:20Z</dc:date>
    </item>
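The question above comes down to how the console tools are pointed at a secured cluster: on a SASL_SSL cluster, both tools need a client properties file and a JAAS configuration handed to the JVM. A minimal sketch of the moving parts follows; the host name, topic, and file paths are placeholders, not values confirmed in this thread:

```
# client.properties (passed to both console tools)
security.protocol=SASL_SSL
sasl.kerberos.service.name=kafka
ssl.truststore.location=/path/to/truststore.jks
ssl.truststore.password=changeit

# point the JVM at a jaas.conf before running either tool
export KAFKA_OPTS="-Djava.security.auth.login.config=/path/to/jaas.conf"

# producer: talks to the broker's secured listener
kafka-console-producer --broker-list broker-host:9093 \
    --topic test --producer.config client.properties

# consumer: with --bootstrap-server it connects to a broker too,
# not to the ZooKeeper port 2181
kafka-console-consumer --bootstrap-server broker-host:9093 \
    --topic test --consumer.config client.properties
```

The exact flag names vary a little between Kafka releases; the point is that every tool invocation carries the security.protocol and truststore settings, or it silently falls back to plaintext and times out.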
    <item>
      <title>Re: Timeout Error When Using kafka-console-consumer and kafka-console-producer On Secured Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58064#M45032</link>
      <description>Try turning on DEBUG for the client commands. If you have a kafka gateway installed on that node, you can set the DEBUG level in the kafka service; otherwise, modify /etc/kafka/conf/tools-log4j.properties to set the log level to DEBUG, and then run your producer or consumer.&lt;BR /&gt;&lt;BR /&gt;Some "retryable" errors when security is enabled can keep the clients from properly connecting.&lt;BR /&gt;&lt;BR /&gt;-pd</description>
      <pubDate>Wed, 26 Jul 2017 19:05:41 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58064#M45032</guid>
      <dc:creator>pdvorak</dc:creator>
      <dc:date>2017-07-26T19:05:41Z</dc:date>
    </item>
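For the no-gateway case, the suggestion above amounts to editing the tools log4j file on the node where the console commands run. A sketch of the change (the stock file usually sets the root logger to WARN; exact contents vary by release, so treat this as illustrative):

```
# /etc/kafka/conf/tools-log4j.properties
# change WARN to DEBUG to see client-side SASL/SSL negotiation
log4j.rootLogger=DEBUG, stderr

log4j.appender.stderr=org.apache.log4j.ConsoleAppender
log4j.appender.stderr.layout=org.apache.log4j.PatternLayout
log4j.appender.stderr.layout.ConversionPattern=[%d] %p %m (%c)%n
log4j.appender.stderr.Target=System.err
```

This file only affects the command-line tools, not the brokers, which is the distinction drawn later in the thread.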
    <item>
      <title>Re: Timeout Error When Using kafka-console-consumer and kafka-console-producer On Secured Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58066#M45033</link>
      <description>&lt;P&gt;I've set the Kafka Broker Logging Threshold to DEBUG, and am seeing DEBUG statements in the Kafka Broker logs. It obviously puts out a lot of information, but I haven't come across anything that looked to be interesting or useful.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;This cluster does not have a gateway instance at all.&lt;/P&gt;</description>
      <pubDate>Wed, 26 Jul 2017 19:26:35 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58066#M45033</guid>
      <dc:creator>mcginnda</dc:creator>
      <dc:date>2017-07-26T19:26:35Z</dc:date>
    </item>
    <item>
      <title>Re: Timeout Error When Using kafka-console-consumer and kafka-console-producer On Secured Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58068#M45034</link>
      <description>You need the DEBUG level set for the clients, not the brokers. This would be in the "Gateway Logging Threshold", or on the system where you are running the console commands by editing /etc/kafka/conf/tools-log4j.properties&lt;BR /&gt;&lt;BR /&gt;-pd</description>
      <pubDate>Wed, 26 Jul 2017 19:35:35 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58068#M45034</guid>
      <dc:creator>pdvorak</dc:creator>
      <dc:date>2017-07-26T19:35:35Z</dc:date>
    </item>
    <item>
      <title>Re: Timeout Error When Using kafka-console-consumer and kafka-console-producer On Secured Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58069#M45035</link>
      <description>&lt;P&gt;Ah OK, I apologize, I didn't realize the logs were separately controlled.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;When I enabled that, both consumer and producer come back with errors constantly.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The consumer shows the following stack trace constantly, from the moment it starts until I close it:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;17/07/26 14:44:40 DEBUG authenticator.SaslClientAuthenticator: Set SASL client state to SEND_HANDSHAKE_REQUEST
17/07/26 14:44:40 DEBUG authenticator.SaslClientAuthenticator: Creating SaslClient: client=svcnonprodhadoop@&amp;lt;DOMAIN&amp;gt;;service=kafka;serviceHostname=svd0hdatn01.&amp;lt;DOMAIN&amp;gt;;mechs=[GSSAPI]
17/07/26 14:44:40 DEBUG network.Selector: Created socket with SO_RCVBUF = 65536, SO_SNDBUF = 124928, SO_TIMEOUT = 0 to node -1
17/07/26 14:44:40 DEBUG authenticator.SaslClientAuthenticator: Set SASL client state to RECEIVE_HANDSHAKE_RESPONSE
17/07/26 14:44:40 DEBUG clients.NetworkClient: Completed connection to node -1.  Fetching API versions.
17/07/26 14:44:40 DEBUG network.Selector: Connection with svd0hdatn01.&amp;lt;DOMAIN&amp;gt;/10.96.88.42 disconnected
java.io.EOFException
        at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:83)
        at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:71)
        at org.apache.kafka.common.security.authenticator.SaslClientAuthenticator.receiveResponseOrToken(SaslClientAuthenticator.java:242)
        at org.apache.kafka.common.security.authenticator.SaslClientAuthenticator.authenticate(SaslClientAuthenticator.java:166)
        at org.apache.kafka.common.network.KafkaChannel.prepare(KafkaChannel.java:71)
        at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:350)
        at org.apache.kafka.common.network.Selector.poll(Selector.java:303)
        at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:370)
        at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:226)
        at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:203)
        at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.awaitMetadataUpdate(ConsumerNetworkClient.java:138)
        at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.ensureCoordinatorReady(AbstractCoordinator.java:219)
        at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.ensureCoordinatorReady(AbstractCoordinator.java:196)
        at org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.poll(ConsumerCoordinator.java:281)
        at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce(KafkaConsumer.java:1030)
        at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:996)
        at kafka.consumer.NewShinyConsumer.&amp;lt;init&amp;gt;(BaseConsumer.scala:55)
        at kafka.tools.ConsoleConsumer$.run(ConsoleConsumer.scala:69)
        at kafka.tools.ConsoleConsumer$.main(ConsoleConsumer.scala:50)
        at kafka.tools.ConsoleConsumer.main(ConsoleConsumer.scala)
17/07/26 14:44:40 DEBUG clients.NetworkClient: Node -1 disconnected.
17/07/26 14:44:40 DEBUG clients.NetworkClient: Give up sending metadata request since no node is available
17/07/26 14:44:40 DEBUG clients.NetworkClient: Initialize connection to node -1 for sending metadata request
17/07/26 14:44:40 DEBUG clients.NetworkClient: Initiating connection to node -1 at svd0hdatn01.&amp;lt;DOMAIN&amp;gt;:2181.&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The producer shows the following log output as soon as any input is given to put into the topic:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;17/07/26 14:45:43 DEBUG authenticator.SaslClientAuthenticator: Set SASL client state to SEND_HANDSHAKE_REQUEST
17/07/26 14:45:43 DEBUG authenticator.SaslClientAuthenticator: Creating SaslClient: client=svcnonprodhadoop@&amp;lt;DOMAIN&amp;gt;;service=kafka;serviceHostname=svd0hdatn01.&amp;lt;DOMAIN&amp;gt;;mechs=[GSSAPI]
17/07/26 14:45:43 DEBUG network.Selector: Connection with svd0hdatn01.&amp;lt;DOMAIN&amp;gt;/&amp;lt;IP_ADDRESS&amp;gt; disconnected
java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
        at org.apache.kafka.common.network.PlaintextTransportLayer.finishConnect(PlaintextTransportLayer.java:51)
        at org.apache.kafka.common.network.KafkaChannel.finishConnect(KafkaChannel.java:81)
        at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:335)
        at org.apache.kafka.common.network.Selector.poll(Selector.java:303)
        at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:370)
        at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:225)
        at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:126)
        at java.lang.Thread.run(Thread.java:745)
17/07/26 14:45:43 DEBUG clients.NetworkClient: Node -1 disconnected.
17/07/26 14:45:43 DEBUG clients.NetworkClient: Give up sending metadata request since no node is available
17/07/26 14:45:44 DEBUG clients.NetworkClient: Initialize connection to node -1 for sending metadata request
17/07/26 14:45:44 DEBUG clients.NetworkClient: Initiating connection to node -1 at svd0hdatn01.&amp;lt;DOMAIN&amp;gt;:9092.&lt;/PRE&gt;</description>
      <pubDate>Wed, 26 Jul 2017 19:50:51 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58069#M45035</guid>
      <dc:creator>mcginnda</dc:creator>
      <dc:date>2017-07-26T19:50:51Z</dc:date>
    </item>
    <item>
      <title>Re: Timeout Error When Using kafka-console-consumer and kafka-console-producer On Secured Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58076#M45036</link>
      <description>Connection refused seems to indicate that this host is not listening on port 9092:&lt;BR /&gt;&lt;BR /&gt;svd0hdatn01.&amp;lt;DOMAIN&amp;gt;:9092&lt;BR /&gt;&lt;BR /&gt;You can see what ports kafka is listening on:&lt;BR /&gt;ps -ef |grep kafka&lt;BR /&gt;netstat -nap |grep &amp;lt;kafka pid&amp;gt;&lt;BR /&gt;&lt;BR /&gt;-pd&lt;BR /&gt;</description>
      <pubDate>Wed, 26 Jul 2017 20:55:59 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58076#M45036</guid>
      <dc:creator>pdvorak</dc:creator>
      <dc:date>2017-07-26T20:55:59Z</dc:date>
    </item>
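The "Connection refused" diagnosis above can be checked without Kafka at all: refused means nothing is accepting TCP connections on that host:port. A small self-contained probe, assuming only the Python standard library (host and port values are whatever you want to test):

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP listener accepts connections on host:port.

    A refused connection here corresponds to the 'Connection refused'
    in the producer log: no process is listening on that port.
    ConnectionRefusedError and socket timeouts are both OSError
    subclasses, so a single except clause covers them.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Running `port_open("svd0hdatn01", 9092)` and `port_open("svd0hdatn01", 9093)` would have distinguished the two cases in this thread directly, complementing the `ps`/`netstat` check run on the broker itself.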
    <item>
      <title>Re: Timeout Error When Using kafka-console-consumer and kafka-console-producer On Secured Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58078#M45037</link>
      <description>&lt;P&gt;OK, so it looks like that took care of one problem, but there's still a problem with the consumer. Following your instructions, I found that the kafka broker was operating on port 9093, not 9092.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Fixing that on the producer then caused the same EOF error to come up as I am seeing on the consumer.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;17/07/26 16:04:25 DEBUG authenticator.SaslClientAuthenticator: Set SASL client state to SEND_HANDSHAKE_REQUEST
17/07/26 16:04:25 DEBUG authenticator.SaslClientAuthenticator: Creating SaslClient: client=svcnonprodhadoop@&amp;lt;DOMAIN&amp;gt;;service=kafka;serviceHostname=svd0hdatn01;mechs=[GSSAPI]
17/07/26 16:04:25 DEBUG network.Selector: Created socket with SO_RCVBUF = 32768, SO_SNDBUF = 102400, SO_TIMEOUT = 0 to node -1
17/07/26 16:04:25 DEBUG authenticator.SaslClientAuthenticator: Set SASL client state to RECEIVE_HANDSHAKE_RESPONSE
17/07/26 16:04:25 DEBUG clients.NetworkClient: Completed connection to node -1.  Fetching API versions.
17/07/26 16:04:25 DEBUG network.Selector: Connection with svd0hdatn01/10.96.88.42 disconnected
java.io.EOFException
        at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:99)
        at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:71)
        at org.apache.kafka.common.security.authenticator.SaslClientAuthenticator.receiveResponseOrToken(SaslClientAuthenticator.java:242)
        at org.apache.kafka.common.security.authenticator.SaslClientAuthenticator.authenticate(SaslClientAuthenticator.java:166)
        at org.apache.kafka.common.network.KafkaChannel.prepare(KafkaChannel.java:71)
        at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:350)
        at org.apache.kafka.common.network.Selector.poll(Selector.java:303)
        at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:370)
        at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:225)
        at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:126)
        at java.lang.Thread.run(Thread.java:745)
17/07/26 16:04:25 DEBUG clients.NetworkClient: Node -1 disconnected.
17/07/26 16:04:25 DEBUG clients.NetworkClient: Give up sending metadata request since no node is available
17/07/26 16:04:25 DEBUG clients.NetworkClient: Initialize connection to node -1 for sending metadata request
17/07/26 16:04:25 DEBUG clients.NetworkClient: Initiating connection to node -1 at svd0hdatn01:9093.&lt;/PRE&gt;</description>
      <pubDate>Wed, 26 Jul 2017 21:06:26 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58078#M45037</guid>
      <dc:creator>mcginnda</dc:creator>
      <dc:date>2017-07-26T21:06:26Z</dc:date>
    </item>
    <item>
      <title>Re: Timeout Error When Using kafka-console-consumer and kafka-console-producer On Secured Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58085#M45038</link>
      <description>What are your consumer.config file properties; does it have SASL_SSL as the protocol?&lt;BR /&gt;&lt;BR /&gt;Can you verify that it's listening with SSL correctly:&lt;BR /&gt;openssl s_client -connect svd0hdatn01:9093&lt;BR /&gt;&lt;BR /&gt;-pd</description>
      <pubDate>Thu, 27 Jul 2017 02:52:56 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58085#M45038</guid>
      <dc:creator>pdvorak</dc:creator>
      <dc:date>2017-07-27T02:52:56Z</dc:date>
    </item>
    <item>
      <title>Re: Timeout Error When Using kafka-console-consumer and kafka-console-producer On Secured Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58086#M45039</link>
      <description>&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I realized that the client.properties file was using SASL_PLAINTEXT, not SASL_SSL. Updated appropriately. Hitting a new error now, on both producer and consumer. The following error comes up, and then it quits the program. I've verified that jaas.conf is in KAFKA_OPTS properly.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
        at org.apache.kafka.clients.consumer.KafkaConsumer.&amp;lt;init&amp;gt;(KafkaConsumer.java:718)
        at org.apache.kafka.clients.consumer.KafkaConsumer.&amp;lt;init&amp;gt;(KafkaConsumer.java:597)
        at org.apache.kafka.clients.consumer.KafkaConsumer.&amp;lt;init&amp;gt;(KafkaConsumer.java:579)
        at kafka.consumer.NewShinyConsumer.&amp;lt;init&amp;gt;(BaseConsumer.scala:53)
        at kafka.tools.ConsoleConsumer$.run(ConsoleConsumer.scala:69)
        at kafka.tools.ConsoleConsumer$.main(ConsoleConsumer.scala:50)
        at kafka.tools.ConsoleConsumer.main(ConsoleConsumer.scala)
Caused by: org.apache.kafka.common.KafkaException: javax.security.auth.login.LoginException: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. not available to garner  authentication information from the user
        at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:93)
        at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:109)
        at org.apache.kafka.common.network.ChannelBuilders.clientChannelBuilder(ChannelBuilders.java:55)
        at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:84)
        at org.apache.kafka.clients.consumer.KafkaConsumer.&amp;lt;init&amp;gt;(KafkaConsumer.java:657)
        ... 6 more
Caused by: javax.security.auth.login.LoginException: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. not available to garner  authentication information from the user
        at com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:899)
        at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:719)
        at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:584)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at javax.security.auth.login.LoginContext.invoke(LoginContext.java:762)
        at javax.security.auth.login.LoginContext.access$000(LoginContext.java:203)
        at javax.security.auth.login.LoginContext$4.run(LoginContext.java:690)
        at javax.security.auth.login.LoginContext$4.run(LoginContext.java:688)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:687)
        at javax.security.auth.login.LoginContext.login(LoginContext.java:595)
        at org.apache.kafka.common.security.authenticator.AbstractLogin.login(AbstractLogin.java:55)
        at org.apache.kafka.common.security.kerberos.KerberosLogin.login(KerberosLogin.java:100)
        at org.apache.kafka.common.security.authenticator.LoginManager.&amp;lt;init&amp;gt;(LoginManager.java:52)
        at org.apache.kafka.common.security.authenticator.LoginManager.acquireLoginManager(LoginManager.java:81)
        at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:85)&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Also, I ran the command you mentioned above, and everything looks right. SSL handshake read 3151 bytes and wrote 499 bytes using TLS v1.2. If you need more information from it, let me know.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;EDIT: Realized that the properties file actually was wrong. Updating with relevant information because of this.&lt;/P&gt;</description>
      <pubDate>Thu, 27 Jul 2017 03:12:52 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58086#M45039</guid>
      <dc:creator>mcginnda</dc:creator>
      <dc:date>2017-07-27T03:12:52Z</dc:date>
    </item>
    <item>
      <title>Re: Timeout Error When Using kafka-console-consumer and kafka-console-producer On Secured Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58110#M45040</link>
      <description>&lt;P&gt;After researching this a bit, I tried a few more things, none of which changed the error:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;I moved to using the keytab for the kafka user (&lt;A href="https://community.cloudera.com/t5/Data-Ingestion-Integration/Unable-to-connect-to-kerberized-Kafka-2-1-0-10-from-Spark-2-1/m-p/56026" target="_blank"&gt;https://community.cloudera.com/t5/Data-Ingestion-Integration/Unable-to-connect-to-kerberized-Kafka-2-1-0-10-from-Spark-2-1/m-p/56026&lt;/A&gt;)&lt;/LI&gt;&lt;LI&gt;I placed the jaas.conf file in /tmp on every node in the cluster and pointed KAFKA_OPTS to that place (&lt;A href="https://stackoverflow.com/questions/43190784/spark-streaming-kafka-kerberos" target="_blank"&gt;https://stackoverflow.com/questions/43190784/spark-streaming-kafka-kerberos&lt;/A&gt;)&lt;/LI&gt;&lt;LI&gt;I exported KAFKA_CLIENT_KERBEROS_PARAMS to be the same as KAFKA_OPTS (&lt;A href="https://community.hortonworks.com/content/supportkb/49422/running-kafka-client-bin-scripts-in-secure-envrion.html" target="_blank"&gt;https://community.hortonworks.com/content/supportkb/49422/running-kafka-client-bin-scripts-in-secure-envrion.html&lt;/A&gt;)&lt;/LI&gt;&lt;/UL&gt;</description>
      <pubDate>Thu, 27 Jul 2017 15:29:22 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58110#M45040</guid>
      <dc:creator>mcginnda</dc:creator>
      <dc:date>2017-07-27T15:29:22Z</dc:date>
    </item>
    <item>
      <title>Re: Timeout Error When Using kafka-console-consumer and kafka-console-producer On Secured Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58111#M45041</link>
      <description>&lt;P&gt;This indicates that your jaas.conf references a keytab that needs a password, or you are using the ticket cache without doing a kinit before running this command.&lt;BR /&gt;&lt;BR /&gt;Confirm that you are able to connect to the cluster (hdfs dfs -ls /) from the command line first, and then check your jaas.conf based on this documentation:&lt;BR /&gt;&lt;A href="https://www.cloudera.com/documentation/kafka/latest/topics/kafka_security.html" target="_blank"&gt;https://www.cloudera.com/documentation/kafka/latest/topics/kafka_security.html&lt;/A&gt;&lt;BR /&gt;&lt;BR /&gt;-pd&lt;/P&gt;</description>
      <pubDate>Thu, 27 Jul 2017 15:31:45 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58111#M45041</guid>
      <dc:creator>pdvorak</dc:creator>
      <dc:date>2017-07-27T15:31:45Z</dc:date>
    </item>
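The two failure modes described above map to two jaas.conf shapes. A sketch of both, following the pattern in the linked Cloudera documentation; the principal and keytab path are placeholders, and a real file contains only one KafkaClient section:

```
// Variant 1: ticket cache — requires a successful kinit in the same
// session before the console tool starts; otherwise login fails
// asking for a password, as in the LoginException earlier in the thread.
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useTicketCache=true;
};

// Variant 2: keytab — no interactive kinit needed.
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/path/to/user.keytab"
  principal="user@EXAMPLE.COM";
};
```

With variant 1, an expired or empty ticket cache produces exactly the "client is being asked for a password" error, which matches the resolution the original poster eventually reached by re-running kinit.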
    <item>
      <title>Re: Timeout Error When Using kafka-console-consumer and kafka-console-producer On Secured Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58113#M45042</link>
      <description>&lt;P&gt;OK, finally got everything working.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;As for the last error I had been seeing, I had thought for sure my kerberos credentials were still showing up in klist, but this morning when I kinited in, everything worked fine, so that must have been the issue.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I then got an error on the consumer side, which I soon realized was because with the new bootstrap-servers parameter, you need to use the same port as the producer (9093 in my case), not the zookeeper port. Once I updated this, everything worked properly.&lt;/P&gt;</description>
      <pubDate>Thu, 27 Jul 2017 15:39:18 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/58113#M45042</guid>
      <dc:creator>mcginnda</dc:creator>
      <dc:date>2017-07-27T15:39:18Z</dc:date>
    </item>
    <item>
      <title>Re: Timeout Error When Using kafka-console-consumer and kafka-console-producer On Secured Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/61910#M45043</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have an issue on Kafka: while running the stream from producer to consumer, I am facing an error:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms
ERROR Error when sending message to topic binary_kafka_source with key: null, value: 175 bytes with error: (org.apache.kafka.clients.producer.internals.ErrorLoggingCallback)&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Please, can anyone help?&lt;/P&gt;</description>
      <pubDate>Thu, 16 Nov 2017 09:42:10 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/61910#M45043</guid>
      <dc:creator>infor</dc:creator>
      <dc:date>2017-11-16T09:42:10Z</dc:date>
    </item>
    <item>
      <title>Re: Timeout Error When Using kafka-console-consumer and kafka-console-producer On Secured Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/61911#M45044</link>
      <description>and&lt;BR /&gt;17/11/16 12:23:23 INFO zkclient.ZkClient: zookeeper state changed (Disconnected)&lt;BR /&gt;17/11/16 12:23:23 INFO zkclient.ZkClient: zookeeper state changed (Disconnected)&lt;BR /&gt;17/11/16 12:23:23 INFO zkclient.ZkClient: zookeeper state changed (Disconnected)&lt;BR /&gt;17/11/16 12:23:24 INFO zkclient.ZkClient: zookeeper state changed (SyncConnected)&lt;BR /&gt;17/11/16 12:23:24 INFO zkclient.ZkClient: zookeeper state changed (SyncConnected)&lt;BR /&gt;17/11/16 12:23:24 INFO zkclient.ZkClient: zookeeper state changed (SyncConnected)&lt;BR /&gt;</description>
      <pubDate>Thu, 16 Nov 2017 09:43:21 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/61911#M45044</guid>
      <dc:creator>infor</dc:creator>
      <dc:date>2017-11-16T09:43:21Z</dc:date>
    </item>
    <item>
      <title>Re: Timeout Error When Using kafka-console-consumer and kafka-console-producer On Secured Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/65903#M45045</link>
      <description>&lt;P&gt;Hi Team,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am getting the below Kafka exceptions in the log; can anyone help me understand why we are getting them?&lt;/P&gt;&lt;P&gt;30 08:10:51.052 [Thread-13] org.apache.kafka.common.KafkaException: Failed to construct kafka producer&lt;/P&gt;&lt;P&gt;30 04:48:04.035 [Thread-1] org.apache.kafka.common.KafkaException: Failed to construct kafka consumer&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thank you all for your help.&lt;/P&gt;</description>
      <pubDate>Fri, 30 Mar 2018 08:00:42 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/65903#M45045</guid>
      <dc:creator>SureshPallapolu</dc:creator>
      <dc:date>2018-03-30T08:00:42Z</dc:date>
    </item>
    <item>
      <title>Re: Timeout Error When Using kafka-console-consumer and kafka-console-producer On Secured Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/65921#M45046</link>
      <description>There isn't enough information here to determine what the problem could be. If you can provide more log entries and your configuration, that may help.&lt;BR /&gt;&lt;BR /&gt;-pd</description>
      <pubDate>Fri, 30 Mar 2018 20:47:03 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/65921#M45046</guid>
      <dc:creator>pdvorak</dc:creator>
      <dc:date>2018-03-30T20:47:03Z</dc:date>
    </item>
    <item>
      <title>Re: Timeout Error When Using kafka-console-consumer and kafka-console-producer On Secured Cluster</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/84155#M45047</link>
      <description>&lt;P&gt;I have the very same problem as&amp;nbsp;&lt;A href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/22971" target="_self"&gt;&lt;SPAN class=""&gt;mcginnda&lt;/SPAN&gt;&lt;/A&gt;&lt;/P&gt;&lt;P&gt;I am trying to configure the Kafka broker to support PLAINTEXT and SSL at the same time, with server.properties configured like this:&lt;/P&gt;&lt;P&gt;listeners=PLAINTEXT://test-ip:9092,SSL://test-ip:9093&lt;BR /&gt;advertised.listeners=PLAINTEXT://test-ip:9092,SSL://test-ip:9093&lt;BR /&gt;advertised.host.name=test-ip&lt;BR /&gt;delete.topic.enable=true&lt;/P&gt;&lt;P&gt;ssl.keystore.location=/kafka/ssl/server.keystore.jks&lt;BR /&gt;ssl.keystore.password=test1234&lt;BR /&gt;ssl.key.password=test1234&lt;BR /&gt;ssl.truststore.location=/kafka/ssl/server.truststore.jks&lt;BR /&gt;ssl.truststore.password=test1234&lt;BR /&gt;ssl.client.auth = required&lt;BR /&gt;ssl.enabled.protocols = TLSv1.2,TLSv1.1,TLSv1&lt;BR /&gt;ssl.keystore.type=JKS&lt;BR /&gt;ssl.truststore.type=JKS&lt;BR /&gt;ssl.secure.random.implementation=SHA1PRNG&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Now, when I use a consumer client to connect to the Kafka server, it does not work.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;In server.log there are many errors like this:&lt;/P&gt;&lt;P&gt;[2018-12-20 15:58:42,295] ERROR Processor got uncaught exception. 
(kafka.network.Processor)&lt;BR /&gt;java.lang.ArrayIndexOutOfBoundsException: 18&lt;BR /&gt;at org.apache.kafka.common.protocol.ApiKeys.forId(ApiKeys.java:68)&lt;BR /&gt;at org.apache.kafka.common.requests.AbstractRequest.getRequest(AbstractRequest.java:39)&lt;BR /&gt;at kafka.network.RequestChannel$Request.&amp;lt;init&amp;gt;(RequestChannel.scala:79)&lt;BR /&gt;at kafka.network.Processor$$anonfun$run$11.apply(SocketServer.scala:426)&lt;BR /&gt;at kafka.network.Processor$$anonfun$run$11.apply(SocketServer.scala:421)&lt;BR /&gt;at scala.collection.Iterator$class.foreach(Iterator.scala:742)&lt;BR /&gt;at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)&lt;BR /&gt;at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)&lt;BR /&gt;at scala.collection.AbstractIterable.foreach(Iterable.scala:54)&lt;BR /&gt;at kafka.network.Processor.run(SocketServer.scala:421)&lt;BR /&gt;at java.lang.Thread.run(Thread.java:748)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;and consumer DEBUG error like this:&lt;/P&gt;&lt;P&gt;2018-12-20 16:04:08,103 DEBUG ZTE org.apache.kafka.common.network.Selector TransactionID=null InstanceID=null [] Connection with test-ip/110.10.10.100 disconnected [Selector.java] [307]&lt;BR /&gt;java.io.EOFException: null&lt;BR /&gt;at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:99)&lt;BR /&gt;at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:71)&lt;BR /&gt;at org.apache.kafka.common.network.KafkaChannel.receive(KafkaChannel.java:160)&lt;BR /&gt;at org.apache.kafka.common.network.KafkaChannel.read(KafkaChannel.java:141)&lt;BR /&gt;at org.apache.kafka.common.network.Selector.poll(Selector.java:286)&lt;BR /&gt;at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:270)&lt;BR /&gt;at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.clientPoll(ConsumerNetworkClient.java:303)&lt;BR /&gt;at 
org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:197)&lt;BR /&gt;at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:187)&lt;BR /&gt;at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce(KafkaConsumer.java:877)&lt;BR /&gt;at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:829)&lt;BR /&gt;at com.zte.polling.provider.kafka.KafkaClientProvider$$anonfun$receiveMessage$1$$anonfun$apply$mcV$sp$2.apply(KafkaClientProvider.scala:59)&lt;BR /&gt;at com.zte.polling.provider.kafka.KafkaClientProvider$$anonfun$receiveMessage$1$$anonfun$apply$mcV$sp$2.apply(KafkaClientProvider.scala:57)&lt;BR /&gt;at scala.collection.Iterator$class.foreach(Iterator.scala:727)&lt;BR /&gt;at com.zte.nfv.core.InfiniteIterate.foreach(InfiniteIterate.scala:4)&lt;BR /&gt;at com.zte.polling.provider.kafka.KafkaClientProvider$$anonfun$receiveMessage$1.apply$mcV$sp(KafkaClientProvider.scala:57)&lt;BR /&gt;at com.zte.polling.provider.kafka.KafkaClientProvider$$anonfun$receiveMessage$1.apply(KafkaClientProvider.scala:54)&lt;BR /&gt;at com.zte.polling.provider.kafka.KafkaClientProvider$$anonfun$receiveMessage$1.apply(KafkaClientProvider.scala:54)&lt;BR /&gt;at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)&lt;BR /&gt;at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)&lt;BR /&gt;at scala.concurrent.impl.ExecutionContextImpl$$anon$3.exec(ExecutionContextImpl.scala:107)&lt;BR /&gt;at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)&lt;BR /&gt;at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)&lt;BR /&gt;at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)&lt;BR /&gt;at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 20 Dec 2018 08:37:40 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Timeout-Error-When-Using-kafka-console-consumer-and-kafka/m-p/84155#M45047</guid>
      <dc:creator>simpleli</dc:creator>
      <dc:date>2018-12-20T08:37:40Z</dc:date>
    </item>
  </channel>
</rss>