<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Unable to read Kafka topic messages in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Unable-to-read-Kafka-topic-messages/m-p/286410#M212433</link>
    <description>&lt;P&gt;Hi All,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The issue has been fixed.&amp;nbsp; It was caused by the Spark executor JVM options being set incorrectly.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks and Regards,&lt;/P&gt;&lt;P&gt;Sudhindra&lt;/P&gt;</description>
    <pubDate>Fri, 27 Dec 2019 07:05:16 GMT</pubDate>
    <dc:creator>ssk26</dc:creator>
    <dc:date>2019-12-27T07:05:16Z</dc:date>
    <item>
      <title>Unable to read Kafka topic messages</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Unable-to-read-Kafka-topic-messages/m-p/286211#M212289</link>
      <description>&lt;P&gt;Hi,&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am working on one of the Kafka use-cases in my project, where the scenario is as follows:&lt;/P&gt;&lt;P&gt;Step 1:&amp;nbsp; I update records of a table (let's say CUSTOMER) in a web portal.&amp;nbsp;&lt;/P&gt;&lt;P&gt;Step 2:&amp;nbsp; A Spark Streaming job runs and captures the DES (Data Event Streaming) eventId related to the above.&amp;nbsp;&lt;/P&gt;&lt;P&gt;Step 3:&amp;nbsp; It connects to the broker at port 9092, pulls the messages, processes them, and writes them as records into an RDBMS table.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;This entire scenario is executed in a Spark cluster set up inside Kubernetes.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The same job runs perfectly fine outside Kubernetes.&amp;nbsp; &amp;nbsp;But inside Kubernetes, although the job connects to the broker at port 9092 without any issues, it does not read any real-time events and I get a loop of the messages below:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;19/12/23 16:00:00.180 DEBUG Fetcher: [Consumer clientId=consumer-1, groupId=test] Added READ_UNCOMMITTED fetch request for partition table-update-0 at offset 102 to node 10.20.0.44:29092 (id: 1 rack: null)&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;19/12/23 16:00:00.180 DEBUG FetchSessionHandler: [Consumer clientId=consumer-1, groupId=test] Built incremental fetch (sessionId=96537117, epoch=359) for node 1. 
Added 0 partition(s), altered 0 partition(s), removed 0 partition(s) out of 1 partition(s)&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;19/12/23 16:00:00.180 DEBUG Fetcher: [Consumer clientId=consumer-1, groupId=test] Sending READ_UNCOMMITTED IncrementalFetchRequest(toSend=(), toForget=(), implied=(table-update-0)) to broker 10.20.0.44:29092 (id: 1 rack: null)&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;19/12/23 16:00:00.246 DEBUG TaskSchedulerImpl: parentName: , name: TaskSet_0.0, runningTasks: 0&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;19/12/23 16:00:00.683 DEBUG FetchSessionHandler: [Consumer clientId=consumer-1, groupId=test] Node 1 sent an incremental fetch response for session 96537117 with 0 response partition(s), 1 implied partition(s)&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I need your help in understanding what could be wrong with the setup in Kubernetes.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Please let me know if any additional information is required.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks and Regards,&amp;nbsp;&lt;/P&gt;&lt;P&gt;Sudhindra&lt;/P&gt;</description>
      <pubDate>Tue, 24 Dec 2019 04:23:36 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Unable-to-read-Kafka-topic-messages/m-p/286211#M212289</guid>
      <dc:creator>ssk26</dc:creator>
      <dc:date>2019-12-24T04:23:36Z</dc:date>
    </item>
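The question above reports a fetch loop that polls partition table-update-0 at offset 102 and receives empty responses. A quick way to check, independently of Spark, whether the topic actually has records past that offset is Kafka's bundled console consumer. This is a diagnostic sketch using the broker address and topic names taken from the logs above; it assumes it is run from a pod on the same Kubernetes network as the broker:

```shell
# Diagnostic sketch: read partition 0 of the topic from offset 102,
# using the broker address seen in the Spark logs above.
# Exits after 10 s if no records arrive.
kafka-console-consumer.sh \
  --bootstrap-server 10.20.0.44:29092 \
  --topic table-update \
  --partition 0 \
  --offset 102 \
  --timeout-ms 10000
```

If this prints nothing, the problem is upstream of Spark: either no records are reaching the partition, or the address the broker advertises to clients differs from the one reachable inside the cluster.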
    <item>
      <title>Re: Unable to read Kafka topic messages</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Unable-to-read-Kafka-topic-messages/m-p/286214#M212292</link>
      <description>&lt;P&gt;Hello&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Could you please verify that Spark Streaming is connecting to the right Kafka broker host? Check whether&amp;nbsp;&lt;STRONG&gt;10.20.0.44:29092 &lt;/STRONG&gt;is the correct IP:port.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Also, please monitor the Kafka broker logs to verify that the Spark Streaming job is connected to the broker.&lt;/P&gt;</description>
      <pubDate>Tue, 24 Dec 2019 05:03:39 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Unable-to-read-Kafka-topic-messages/m-p/286214#M212292</guid>
      <dc:creator>senthh</dc:creator>
      <dc:date>2019-12-24T05:03:39Z</dc:date>
    </item>
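The checks suggested above can also be done from the broker side. A minimal sketch, assuming typical Kafka file locations (the paths below are illustrative and vary by distribution):

```shell
# Sketch of broker-side checks; paths are illustrative assumptions.
# 1. Confirm which addresses the broker listens on and advertises to
#    clients. A mismatch between the advertised address and what a pod
#    can actually reach is a common cause of "connects but reads
#    nothing" symptoms in Kubernetes.
grep -E "^(listeners|advertised.listeners)" /etc/kafka/server.properties

# 2. Watch the broker log for the consumer's connection attempts,
#    matching on the clientId seen in the Spark logs.
grep "consumer-1" /var/log/kafka/server.log | tail -n 20
```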
    <item>
      <title>Re: Unable to read Kafka topic messages</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Unable-to-read-Kafka-topic-messages/m-p/286215#M212293</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/30744"&gt;@senthh&lt;/a&gt;,&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks a lot for your reply.&amp;nbsp; I have monitored the Spark Streaming logs and verified that the connection to the broker is established correctly.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Given below are the logs confirming the same:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The interesting thing is that the same Spark Streaming job works outside the Kubernetes setup without any issues.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Please help!!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;19/12/23 15:56:59.562 INFO AppInfoParser: Kafka version: 2.2.1&lt;BR /&gt;19/12/23 15:56:59.563 INFO AppInfoParser: Kafka commitId: 55783d3133a5a49a&lt;BR /&gt;19/12/23 15:56:59.566 DEBUG KafkaConsumer: [Consumer clientId=consumer-1, groupId=test] Kafka consumer initialized&lt;BR /&gt;19/12/23 15:56:59.569 INFO KafkaConsumer: [Consumer clientId=consumer-1, groupId=test] Subscribed to partition(s): table-update-0&lt;BR /&gt;19/12/23 15:56:59.593 DEBUG NetworkClient: [Consumer clientId=consumer-1, groupId=test] &lt;STRONG&gt;Initialize connection to node 10.20.0.44:29092 (id: -1 rack: null) for sending metadata request&lt;/STRONG&gt;&lt;BR /&gt;19/12/23 15:56:59.596 DEBUG NetworkClient: [Consumer clientId=consumer-1, groupId=test] Initiating connection to node 10.20.0.44:29092 (id: -1 rack: null) using address /10.20.0.44&lt;BR /&gt;19/12/23 15:56:59.640 DEBUG Metrics: Added sensor with name node--1.bytes-sent&lt;BR /&gt;19/12/23 15:56:59.642 DEBUG Metrics: Added sensor with name node--1.bytes-received&lt;BR /&gt;19/12/23 15:56:59.642 DEBUG Metrics: Added sensor with name node--1.latency&lt;BR /&gt;19/12/23 15:56:59.643 DEBUG Selector: [Consumer clientId=consumer-1, groupId=test] Created socket with SO_RCVBUF = 65536, SO_SNDBUF = 131072, SO_TIMEOUT = 0 
to node -1&lt;BR /&gt;19/12/23 15:56:59.928 DEBUG NetworkClient: [Consumer clientId=consumer-1, groupId=test] &lt;STRONG&gt;Completed connection to node -1. Fetching API versions.&lt;/STRONG&gt;&lt;BR /&gt;19/12/23 15:56:59.929 DEBUG NetworkClient: [Consumer clientId=consumer-1, groupId=test] Initiating API versions fetch from node -1.&lt;BR /&gt;19/12/23 15:56:59.962 DEBUG NetworkClient: [Consumer clientId=consumer-1, groupId=test] Recorded API versions for node -1: (Produce(0): 0 to 7 [usable: 7], Fetch(1): 0 to 10 [usable: 10], ListOffsets(2): 0 to 5 [usable: 5], Metadata(3): 0 to 7 [usable: 7], LeaderAndIsr(4): 0 to 2 [usable: 2], StopReplica(5): 0 to 1 [usable: 1], UpdateMetadata(6): 0 to 5 [usable: 5], ControlledShutdown(7): 0 to 2 [usable: 2], OffsetCommit(8): 0 to 6 [usable: 6], OffsetFetch(9): 0 to 5 [usable: 5], FindCoordinator(10): 0 to 2 [usable: 2], JoinGroup(11):&lt;/P&gt;</description>
      <pubDate>Tue, 24 Dec 2019 05:27:42 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Unable-to-read-Kafka-topic-messages/m-p/286215#M212293</guid>
      <dc:creator>ssk26</dc:creator>
      <dc:date>2019-12-24T05:27:42Z</dc:date>
    </item>
    <item>
      <title>Re: Unable to read Kafka topic messages</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Unable-to-read-Kafka-topic-messages/m-p/286410#M212433</link>
      <description>&lt;P&gt;Hi All,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The issue has been fixed.&amp;nbsp; It was caused by the Spark executor JVM options being set incorrectly.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks and Regards,&lt;/P&gt;&lt;P&gt;Sudhindra&lt;/P&gt;</description>
      <pubDate>Fri, 27 Dec 2019 07:05:16 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Unable-to-read-Kafka-topic-messages/m-p/286410#M212433</guid>
      <dc:creator>ssk26</dc:creator>
      <dc:date>2019-12-27T07:05:16Z</dc:date>
    </item>
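The resolution above names the cause (a misconfigured Spark executor JVM option) but does not show the setting. As a hedged illustration of where executor JVM options live in a Kubernetes spark-submit, the sketch below uses placeholder values, not the poster's actual configuration; the specific option that was wrong is not stated in the thread:

```shell
# Illustrative only: executor JVM options are passed via
# spark.executor.extraJavaOptions. An error here affects only the
# executors inside the cluster, which is consistent with a job that
# works outside Kubernetes but silently reads nothing inside it.
# K8S_APISERVER, SPARK_IMAGE, and the JVM flag are placeholders.
spark-submit \
  --master k8s://https://K8S_APISERVER:6443 \
  --deploy-mode cluster \
  --conf spark.kubernetes.container.image=SPARK_IMAGE \
  --conf spark.executor.extraJavaOptions="-XX:+UseG1GC" \
  streaming-job.jar
```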
  </channel>
</rss>

