
HDFS upload from local computer to VM HDFS doesn't work

Explorer

I've been trying to fix this issue for a couple of days and have tried many things, but none seem to work.

When trying to move any file from my local computer to the VM using:

hdfs dfs -put /someRandomFile hdfs://sandbox.hortonworks.com/tmp

(in /etc/hosts, sandbox.hortonworks.com maps to 127.0.0.1), I get this error:

18/01/23 14:45:31 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

18/01/23 14:45:32 INFO hdfs.DataStreamer: Exception in createBlockOutputStream

java.io.IOException: Connection reset by peer

	at sun.nio.ch.FileDispatcherImpl.read0(Native Method)

	at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)

	at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)

	at sun.nio.ch.IOUtil.read(IOUtil.java:197)


	at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)

	at org.apache.hadoop.net.SocketInputStream$Reader.performIO(SocketInputStream.java:57)

	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)

	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)

	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)

	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:118)

	at java.io.FilterInputStream.read(FilterInputStream.java:83)

	at java.io.FilterInputStream.read(FilterInputStream.java:83)

	at org.apache.hadoop.hdfs.protocolPB.PBHelperClient.vintPrefixed(PBHelperClient.java:398)

	at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1698)

	at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1619)

	at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:704)

18/01/23 14:45:32 WARN hdfs.DataStreamer: Abandoning BP-1691134265-172.17.0.2-1510324659694:blk_1073742750_1935

18/01/23 14:45:32 WARN hdfs.DataStreamer: Excluding datanode DatanodeInfoWithStorage[172.17.0.2:50010,DS-526760e8-383f-41d4-9009-32a6ade1405e,DISK]

18/01/23 14:45:32 WARN hdfs.DataStreamer: DataStreamer Exception

org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tmp/.profile._COPYING_ could only be replicated to 0 nodes instead of minReplication (=1).  There are 1 datanode(s) running and 1 node(s) are excluded in this operation.

	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1719)

	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNewBlockTargets(FSNamesystem.java:3368)

	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3292)

	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:850)

	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:504)

	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)

	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)

	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)

	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)

	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)

	at java.security.AccessController.doPrivileged(Native Method)

	at javax.security.auth.Subject.doAs(Subject.java:422)

	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)

	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2347)


	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1483)

	at org.apache.hadoop.ipc.Client.call(Client.java:1429)

	at org.apache.hadoop.ipc.Client.call(Client.java:1339)

	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227)

	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)

	at com.sun.proxy.$Proxy10.addBlock(Unknown Source)

	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:440)

	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

	at java.lang.reflect.Method.invoke(Method.java:498)

	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:409)

	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163)

	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155)

	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)

	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:346)

	at com.sun.proxy.$Proxy11.addBlock(Unknown Source)

	at org.apache.hadoop.hdfs.DataStreamer.locateFollowingBlock(DataStreamer.java:1809)

	at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1609)

	at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:704)

put: File /tmp/.profile._COPYING_ could only be replicated to 0 nodes instead of minReplication (=1).  There are 1 datanode(s) running and 1 node(s) are excluded in this operation.

I tried turning the firewall off, adding the dfs.client.use.datanode.hostname property, syncing the time/date, and more, but nothing seems to work. Does anyone know what may cause this issue?
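For reference, this is how the dfs.client.use.datanode.hostname property mentioned above is typically applied: it goes in hdfs-site.xml on the *client* side (the machine running the put), not on the VM. The property name is real; treat this snippet as a sketch of the placement rather than a confirmed fix.

```xml
<!-- hdfs-site.xml on the client machine: tell the HDFS client to connect
     to datanodes by the hostname the NameNode reports, instead of the
     internal IP (e.g. a Docker address like 172.17.0.2) they advertise -->
<property>
  <name>dfs.client.use.datanode.hostname</name>
  <value>true</value>
</property>
```

The same property can also be passed per command without editing any file, e.g. `hdfs dfs -D dfs.client.use.datanode.hostname=true -put /someRandomFile hdfs://sandbox.hortonworks.com:8020/tmp` (8020 is assumed here as the sandbox's default NameNode RPC port).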

5 REPLIES

Expert Contributor

I can't see an error; there are just warnings in your output!

Explorer

Not sure what you mean, but at least from my end I can see the whole error.

New Contributor

Please try the commands below and let us know:

hdfs dfs -put /someRandomFile .

or

hadoop fs -put <source absolute path> <destination path>

Explorer

Hey Manmad, if I do what you suggested, it will just upload the file to my local HDFS and not to the VM's HDFS.
Also, in my version of HDFS you can't use the fs module; it doesn't exist.
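On the local-vs-VM point: instead of relying on whatever fs.defaultFS the local client config points at, a single command can be aimed at the VM's HDFS explicitly with Hadoop's generic `-fs` option. This is only a sketch; the hostname comes from the /etc/hosts mapping above, and port 8020 is an assumption based on the sandbox's usual NameNode RPC port.

```shell
# Target the VM's HDFS for this one command, overriding the local
# client's fs.defaultFS (8020 = NameNode RPC port on the sandbox)
hdfs dfs -fs hdfs://sandbox.hortonworks.com:8020 -put /someRandomFile /tmp
```

Note this only changes which NameNode the client talks to; if the datanode is still advertising an unreachable internal address, the replication error would persist.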

Explorer

Does anyone have any idea?
