Member since: 01-13-2021
Posts: 7
Kudos Received: 0
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1331 | 01-14-2021 06:06 AM
01-14-2021 06:06 AM
Hi, I solved the problem. I added port 50010 to the tcp-hdp.conf, which is mapped to the Docker container.
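In case it helps others, the entry is just another stream-proxy mapping, roughly like the sketch below (based on my setup; the file location and the sandbox-hdp container alias may differ between sandbox versions):

```
# hypothetical excerpt from tcp-hdp.conf: forward the HDFS datanode
# transfer port from the VM to the sandbox-hdp container
server {
    listen 50010;
    proxy_pass sandbox-hdp:50010;
}
```

The mapping is needed because the HDFS client gets the datanode address from the namenode and then writes block data directly to the datanode on port 50010.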
01-13-2021 10:51 PM
Hi, I have HDP 2.6.5 running in VirtualBox with the network mode set to "bridged network". I have opened port 50010 on the sandbox VM and the port is reachable. Now I want to create a text file from my Windows machine with a Java program:

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.DistributedFileSystem;

// Connect to the sandbox namenode and tell the client to address
// datanodes by hostname instead of their Docker-internal IP.
Configuration conf = new Configuration();
conf.set("dfs.client.use.datanode.hostname", "true");
FileSystem fs = new DistributedFileSystem();
fs.initialize(new URI("hdfs://sandbox-hdp.hortonworks.com:8020"), conf);

// Create the file and write a short test string into it.
FSDataOutputStream fdos = fs.create(new Path("/testing/file02.txt"), true);
fdos.writeBytes("Test text for the txt file");
fdos.flush();
fdos.close();
fs.close();
```

The text file is created, but it is empty and I get the following error:

```
21/01/13 19:13:00 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.EOFException: Premature EOF: no length prefix available
    at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:2294)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1480)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1400)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:554)
21/01/13 19:13:00 INFO hdfs.DFSClient: Abandoning BP-1389476969-172.18.0.2-1610559785364:blk_1073741850_1026
21/01/13 19:13:00 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[172.18.0.2:50010,DS-11d5c1d9-fa92-46fd-b130-cd05683300c7,DISK]
21/01/13 19:13:00 WARN hdfs.DFSClient: DataStreamer Exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /testing/file02.txt could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
    at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1719)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNewBlockTargets(FSNamesystem.java:3372)
```

My local PC's IP is 192.168.178.2 and the VM's IP is 192.168.178.39. I have already tried everything recommended under this link, but nothing has helped: https://community.cloudera.com/t5/Support-Questions/Cannot-copy-from-local-machine-to-VM-datanode-via-Java/m-p/141141 Does anyone have an idea what the problem is? Thanks, Marcel
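For what it's worth, this is roughly how the port can be sanity-checked from the Windows side with a plain socket connect (a minimal sketch; the PortCheck class name is just illustrative, and it assumes sandbox-hdp.hortonworks.com resolves to the VM IP via the hosts file):

```java
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    public static void main(String[] args) throws Exception {
        // Try to open a TCP connection to the datanode transfer port
        // with a 3-second timeout; throws if the port is unreachable.
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress("sandbox-hdp.hortonworks.com", 50010), 3000);
            System.out.println("Datanode port 50010 accepts connections");
        }
    }
}
```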
Labels:
- HDFS
- Hortonworks Data Platform (HDP)
01-13-2021 10:37 PM
I found the solution: in HDP 2.6.5 the SSH port for the sandbox VM is the standard port 22.
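So a plain default-port connection works, for example (using the VM IP from my setup; your IP and user name may differ):

```
ssh root@192.168.178.39    # port 22 is the default, so no -p flag is needed
```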
01-13-2021 08:21 AM
I use HDP 2.6.5 and I cannot connect to the sandbox VM. Did something change in version 2.6.5? Thanks, Marcel