
Unexpected HDFS error output just after calling 'CloseFile' returned expected void


Hello,

I'm using the HDP 3.0.1 sandbox in Docker, and when I try to upload some CSV files to HDFS I receive the following error:

 Unexpected HDFS error output just after calling 'CloseFile' returned expected void: \
          FSDataOutputStream#close error: \
          org.apache.hadoop.ipc.RemoteException(java.io.IOException): File integ-hive.f9 could only be written to 0 of the 1 minReplication nodes. There are 1 datanode(s) running and 1 node(s) are excluded in this operation. \
              at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:2121) \
              at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:286) \
              at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2706) \
              at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:875) \
              at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:561) \
              at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) \
              at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524) \
              at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025) \
              at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876) \
              at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822) \
              at java.security.AccessController.doPrivileged(Native Method) \
              at javax.security.auth.Subject.doAs(Subject.java:422) \
              at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) \
              at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682) \
           \
              at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1489) \
              at org.apache.hadoop.ipc.Client.call(Client.java:1435) \
              at org.apache.hadoop.ipc.Client.call(Client.java:1345) \
              at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227) \
              at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116) \
              at com.sun.proxy.$Proxy10.addBlock(Unknown Source) \
              at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:444) \
              at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source) \
              at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) \
              at java.lang.reflect.Method.invoke(Method.java:498) \
              at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:409) \
              at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163) \
              at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155) \
              at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) \
              at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:346) \
              at com.sun.proxy.$Proxy11.addBlock(Unknown Source) \
              at org.apache.hadoop.hdfs.DataStreamer.locateFollowingBlock(DataStreamer.java:1838) \
              at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1638) \
              at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:704)
         2019-07-01T14:52:05+02:00: Exception in createBlockOutputStream
         2019-07-01T14:52:05+02:00: org.apache.hadoop.net.ConnectTimeoutException: 60000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=/172.18.0.2:50010]
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:534)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.hdfs.DataStreamer.createSocketForPipeline(DataStreamer.java:259)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1692)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1648)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:704)
         2019-07-01T14:52:05+02:00: Abandoning BP-1419118625-172.17.0.2-1543512323726:blk_1073779748_38964
         2019-07-01T14:52:05+02:00: Excluding datanode DatanodeInfoWithStorage[172.18.0.2:50010,DS-6c34ba72-0587-4927-88a1-781ba7d444d9,DISK]
         2019-07-01T14:52:05+02:00: DataStreamer Exception
         2019-07-01T14:52:05+02:00: org.apache.hadoop.ipc.RemoteException(java.io.IOException): File integ-hive.f10 could only be written to 0 of the 1 minReplication nodes. There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:2121)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:286)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2706)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:875)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:561)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822)
         2019-07-01T14:52:05+02:00:     at java.security.AccessController.doPrivileged(Native Method)
         2019-07-01T14:52:05+02:00:     at javax.security.auth.Subject.doAs(Subject.java:422)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682)
         2019-07-01T14:52:05+02:00: hive[java]:
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1489)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.ipc.Client.call(Client.java:1435)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.ipc.Client.call(Client.java:1345)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
         2019-07-01T14:52:05+02:00:     at com.sun.proxy.$Proxy10.addBlock(Unknown Source)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:444)
         2019-07-01T14:52:05+02:00:     at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
         2019-07-01T14:52:05+02:00:     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
         2019-07-01T14:52:05+02:00:     at java.lang.reflect.Method.invoke(Method.java:498)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:409)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:346)
         2019-07-01T14:52:05+02:00:     at com.sun.proxy.$Proxy11.addBlock(Unknown Source)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.hdfs.DataStreamer.locateFollowingBlock(DataStreamer.java:1838)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1638)
         2019-07-01T14:52:05+02:00:     at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:704)

Could someone clarify what might be wrong here?
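From the trace, the client times out connecting to the DataNode at 172.18.0.2:50010, which looks like a Docker-internal address, so I suspect the client cannot reach the DataNode directly. I came across the client-side HDFS setting below, which is supposed to make the client connect to DataNodes by hostname instead of the IP the NameNode reports; I'm not sure it applies to my setup, and the placement in the client's hdfs-site.xml is only my assumption:

```xml
<!-- Client-side hdfs-site.xml (illustrative snippet, not my current config). -->
<!-- dfs.client.use.datanode.hostname is a standard HDFS client property. -->
<property>
  <name>dfs.client.use.datanode.hostname</name>
  <value>true</value>
  <description>Connect to DataNodes via their hostname rather than the
  (possibly container-internal) IP address returned by the NameNode.</description>
</property>
```

If that is the right direction, I assume the DataNode port (50010) would also need to be published from the container to the host, but I'd appreciate confirmation before changing the sandbox configuration.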