<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: DataNode cannot send block report to NameNode due to third-party protobuf issues ? in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/DataNode-cannot-send-block-report-to-NameNode-due-to-third/m-p/399787#M250548</link>
    <description>&lt;P&gt;&lt;STRONG&gt;Solved:&amp;nbsp;&lt;/STRONG&gt;&lt;SPAN&gt;Upgrading to Java 11 resolved the issue. The key point is to check the Java version of the client that originally loaded the data into the cluster.&lt;/SPAN&gt;&lt;/P&gt;</description>
    <pubDate>Wed, 08 Jan 2025 07:23:49 GMT</pubDate>
    <dc:creator>tuyen123</dc:creator>
    <dc:date>2025-01-08T07:23:49Z</dc:date>
    <item>
      <title>DataNode cannot send block report to NameNode due to third-party protobuf issues ?</title>
      <link>https://community.cloudera.com/t5/Support-Questions/DataNode-cannot-send-block-report-to-NameNode-due-to-third/m-p/399497#M250498</link>
      <description>&lt;P&gt;org.apache.hadoop.hdfs.server.datanode.DataNode: Unsuccessfully sent block report 0x687e2eff1ccff4e2 with lease ID 0x2e42f5718005b5ef to namenode: hadoop-standby/:8020, containing 1 storage report(s), of which we sent 0. The reports had 659503 total blocks and used 0 RPC(s). This took 83 msecs to generate and 56 msecs for RPC and NN processing. Got back no commands.&lt;BR /&gt;2025-01-02 19:28:31,810 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: RemoteException in offerService&lt;BR /&gt;org.apache.hadoop.ipc.RemoteException(java.io.IOException): java.lang.NoSuchMethodError: java.nio.ByteBuffer.position(I)Ljava/nio/ByteBuffer;&lt;BR /&gt;at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.runBlockOp(BlockManager.java:5558)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.blockReport(NameNodeRpcServer.java:1651)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.blockReport(DatanodeProtocolServerSideTranslatorPB.java:182)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:34769)&lt;BR /&gt;at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:621)&lt;BR /&gt;at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:589)&lt;BR /&gt;at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:573)&lt;BR /&gt;at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1227)&lt;BR /&gt;at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1246)&lt;BR /&gt;at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1169)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:422)&lt;BR /&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1953)&lt;BR 
/&gt;at org.apache.hadoop.ipc.Server$Handler.run(Server.java:3203)&lt;BR /&gt;Caused by: java.lang.NoSuchMethodError: java.nio.ByteBuffer.position(I)Ljava/nio/ByteBuffer;&lt;BR /&gt;at org.apache.hadoop.thirdparty.protobuf.IterableByteBufferInputStream.read(IterableByteBufferInputStream.java:143)&lt;BR /&gt;at org.apache.hadoop.thirdparty.protobuf.CodedInputStream$StreamDecoder.read(CodedInputStream.java:2080)&lt;BR /&gt;at org.apache.hadoop.thirdparty.protobuf.CodedInputStream$StreamDecoder.tryRefillBuffer(CodedInputStream.java:2831)&lt;BR /&gt;at org.apache.hadoop.thirdparty.protobuf.CodedInputStream$StreamDecoder.refillBuffer(CodedInputStream.java:2777)&lt;BR /&gt;at org.apache.hadoop.thirdparty.protobuf.CodedInputStream$StreamDecoder.readRawByte(CodedInputStream.java:2859)&lt;BR /&gt;at org.apache.hadoop.thirdparty.protobuf.CodedInputStream$StreamDecoder.readRawVarint64SlowPath(CodedInputStream.java:2648)&lt;BR /&gt;at org.apache.hadoop.thirdparty.protobuf.CodedInputStream$StreamDecoder.readRawVarint64(CodedInputStream.java:2641)&lt;BR /&gt;at org.apache.hadoop.thirdparty.protobuf.CodedInputStream$StreamDecoder.readSInt64(CodedInputStream.java:2497)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocol.BlockListAsLongs$BufferDecoder$1.next(BlockListAsLongs.java:419)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocol.BlockListAsLongs$BufferDecoder$1.next(BlockListAsLongs.java:397)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.processFirstBlockReport(BlockManager.java:3263)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.processReport(BlockManager.java:2945)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.lambda$blockReport$0(NameNodeRpcServer.java:1652)&lt;BR /&gt;at java.util.concurrent.FutureTask.run(FutureTask.java:266)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager$BlockReportProcessingThread.processQueue(BlockManager.java:5637)&lt;BR /&gt;at 
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager$BlockReportProcessingThread.run(BlockManager.java:5614)&lt;/P&gt;&lt;P&gt;at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1584)&lt;BR /&gt;at org.apache.hadoop.ipc.Client.call(Client.java:1529)&lt;BR /&gt;at org.apache.hadoop.ipc.Client.call(Client.java:1426)&lt;BR /&gt;at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:258)&lt;BR /&gt;at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:139)&lt;BR /&gt;at com.sun.proxy.$Proxy17.blockReport(Unknown Source)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.lambda$blockReport$2(DatanodeProtocolClientSideTranslatorPB.java:212)&lt;BR /&gt;at org.apache.hadoop.ipc.internal.ShadedProtobufHelper.ipc(ShadedProtobufHelper.java:160)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.blockReport(DatanodeProtocolClientSideTranslatorPB.java:212)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.blockReport(BPServiceActor.java:437)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:754)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:914)&lt;BR /&gt;at java.lang.Thread.run(Thread.java:750)&lt;BR /&gt;&lt;BR /&gt;I have tried upgrading Java to the latest Java 8 version, but nothing has changed. Please help me.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 02 Jan 2025 12:55:15 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/DataNode-cannot-send-block-report-to-NameNode-due-to-third/m-p/399497#M250498</guid>
      <dc:creator>tuyen123</dc:creator>
      <dc:date>2025-01-02T12:55:15Z</dc:date>
    </item>
    <item>
      <title>Re: DataNode cannot send block report to NameNode due to third-party protobuf issues ?</title>
      <link>https://community.cloudera.com/t5/Support-Questions/DataNode-cannot-send-block-report-to-NameNode-due-to-third/m-p/399512#M250502</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/122977"&gt;@tuyen123&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT size="3"&gt;If you have installed other applications or dependencies for Spark, Hive, etc. that use a different protobuf version, the conflict can break the block report.&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT size="3"&gt;&lt;STRONG&gt;Locate conflicting protobuf JARs&lt;/STRONG&gt;&lt;/FONT&gt;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;find $HADOOP_HOME -name "protobuf*.jar"&lt;/LI-CODE&gt;&lt;P&gt;&lt;FONT size="3"&gt;Check whether multiple versions are present in &lt;FONT color="#FF0000"&gt;$HADOOP_HOME/lib&lt;/FONT&gt; or other dependency paths.&lt;/FONT&gt;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;P&gt;&lt;FONT size="3"&gt;Remove the conflicting JARs, keeping only the protobuf JAR version that matches your Hadoop distribution, e.g. &lt;FONT color="#FF0000"&gt;protobuf-java-2.5.0.jar&lt;/FONT&gt;.&lt;/FONT&gt;&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;&lt;FONT size="3"&gt;Alternatively, set the protobuf version explicitly in your &lt;FONT color="#FF0000"&gt;CLASSPATH&lt;/FONT&gt;.&lt;/FONT&gt;&lt;/P&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&lt;FONT size="3"&gt;If third-party libraries are included in your Hadoop environment, they might override the correct protobuf version. Open $HADOOP_HOME/etc/hadoop/hadoop-env.sh and prepend the correct protobuf library:&lt;/FONT&gt;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;export HADOOP_CLASSPATH=/path/to/protobuf-java-2.5.0.jar:$HADOOP_CLASSPATH&lt;/LI-CODE&gt;&lt;P&gt;&lt;FONT size="3"&gt;&lt;STRONG&gt;Verify the classpath&lt;/STRONG&gt;&lt;/FONT&gt;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;hadoop classpath | grep protobuf&lt;/LI-CODE&gt;&lt;P&gt;&lt;FONT size="3"&gt;Ensure it includes the correct protobuf JAR.&lt;/FONT&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;FONT size="3"&gt;Please try that and revert. Happy hadooping&lt;/FONT&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 02 Jan 2025 18:59:14 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/DataNode-cannot-send-block-report-to-NameNode-due-to-third/m-p/399512#M250502</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2025-01-02T18:59:14Z</dc:date>
    </item>
    <item>
      <title>Re: DataNode cannot send block report to NameNode due to third-party protobuf issues ?</title>
      <link>https://community.cloudera.com/t5/Support-Questions/DataNode-cannot-send-block-report-to-NameNode-due-to-third/m-p/399526#M250506</link>
      <description>&lt;P&gt;&lt;STRONG&gt;Hi Shelton,&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;I have checked the Hadoop installation, and it is indeed using protobuf 2.5.0; I could not find the protobuf version that Spark uses.&lt;/P&gt;&lt;P&gt;The Hadoop and Spark versions are 3.4.0 and 3.5.3, respectively.&lt;/P&gt;&lt;P&gt;I also found that the client connecting to the Spark master uses Java 11, while the master itself runs on Java 8.&lt;/P&gt;&lt;P&gt;I'm not sure whether this Java version mismatch between the client and the master affects the connection.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Thank you for your response; it has been very enlightening.&lt;/STRONG&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 03 Jan 2025 06:46:13 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/DataNode-cannot-send-block-report-to-NameNode-due-to-third/m-p/399526#M250506</guid>
      <dc:creator>tuyen123</dc:creator>
      <dc:date>2025-01-03T06:46:13Z</dc:date>
    </item>
    <item>
      <title>Re: DataNode cannot send block report to NameNode due to third-party protobuf issues ?</title>
      <link>https://community.cloudera.com/t5/Support-Questions/DataNode-cannot-send-block-report-to-NameNode-due-to-third/m-p/399787#M250548</link>
      <description>&lt;P&gt;&lt;STRONG&gt;Solved:&amp;nbsp;&lt;/STRONG&gt;&lt;SPAN&gt;Upgrading to Java 11 resolved the issue. The key point is to check the Java version of the client that originally loaded the data into the cluster.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 08 Jan 2025 07:23:49 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/DataNode-cannot-send-block-report-to-NameNode-due-to-third/m-p/399787#M250548</guid>
      <dc:creator>tuyen123</dc:creator>
      <dc:date>2025-01-08T07:23:49Z</dc:date>
    </item>
  </channel>
</rss>

