Member since: 07-18-2018
Posts: 14
Kudos Received: 0
Solutions: 0
11-22-2018 08:51 PM
We cannot be sure of the reason for this message from the snippet you have provided. If you notice, the connection is set up successfully, but there is no response from the DN:

~~~ java
java.nio.channels.SocketChannel[connected local=/172.31.15.196:50010 remote=/172.31.1.81:57017]
~~~

This can happen for various reasons: the write pipeline was interrupted, there is network congestion at play, the DN disk is not performing well, the DN host OS is having issues such as kernel soft lockups, or the DN is simply too heavily loaded to respond. You'd have to dig deeper into the logs for more information; start with the messages logged right before the exception in the DN logs.
08-11-2018 02:16 AM
Since it wasn't really described how exactly this was resolved, here is the point: on the client side (it's important that it's not on the server side), set "dfs.client.use.datanode.hostname" to "true" in the org.apache.hadoop.conf.Configuration object. If the Configuration object isn't created by your code (e.g., Spark creates it, as in my case), then it depends on what creates it; see its documentation. But some guesses:

Attempt 1: Set it inside $HADOOP_HOME/etc/hadoop/hdfs-site.xml. The Hadoop command-line tools use that; your Java application, though, maybe not.

Attempt 2: Put $HADOOP_HOME/etc/hadoop/ on the Java classpath (or pack hdfs-site.xml into your project under /src/main/resources/, but that's kind of dirty). This works with Spark.

Spark only: SparkSession.builder().config("spark.hadoop.dfs.client.use.datanode.hostname", "true").[...]

Of course, you may also need to add the DataNodes' hostnames (as the NameNode knows them) to /etc/hosts on the computer running your application. A minimal sketch of the two programmatic variants is shown below.
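For illustration, here is a minimal Java sketch of the two programmatic variants described above. Only the property names come from this thread; the NameNode URI, file path, and master URL are hypothetical placeholders.

~~~ java
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.spark.sql.SparkSession;

public class DatanodeHostnameExample {
    public static void main(String[] args) throws Exception {
        // Plain HDFS client: set the property on the Configuration object
        // before the FileSystem is created from it.
        Configuration conf = new Configuration();
        conf.set("dfs.client.use.datanode.hostname", "true");
        FileSystem fs = FileSystem.get(URI.create("hdfs://namenode.example.com:8020"), conf); // hypothetical URI
        System.out.println(fs.exists(new Path("/tmp"))); // simple sanity check

        // Spark: prefix the Hadoop property with "spark.hadoop." so Spark
        // copies it into the Hadoop Configuration it creates internally.
        SparkSession spark = SparkSession.builder()
                .appName("dfs-client-hostname-example")
                .master("local[*]") // hypothetical; use your real master
                .config("spark.hadoop.dfs.client.use.datanode.hostname", "true")
                .getOrCreate();
        spark.stop();
    }
}
~~~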
07-30-2018 07:48 PM
Have you followed the solution posted above? Depending on where you are writing into your cluster from, you will face this error unless the client has full access to communicate with all of your DataNode hosts on their data-transfer ports. A quick way to check reachability is sketched below.
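In case it helps, here is a minimal Java sketch that tests whether a DataNode's data-transfer port is reachable from the client machine. The hostname is a hypothetical placeholder, and 50010 is the port seen in the stack trace earlier in this thread; your cluster may use a different port.

~~~ java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class DatanodePortCheck {
    public static void main(String[] args) {
        String host = "datanode1.example.com"; // hypothetical DataNode hostname
        int port = 50010; // data-transfer port from the stack trace; may differ on your cluster

        try (Socket socket = new Socket()) {
            // Fail fast if the host/port cannot be reached from this client.
            socket.connect(new InetSocketAddress(host, port), 5000);
            System.out.println("Reachable: " + host + ":" + port);
        } catch (IOException e) {
            System.out.println("NOT reachable: " + host + ":" + port + " (" + e.getMessage() + ")");
        }
    }
}
~~~

If this check fails for any DataNode the NameNode hands back to the client, writes will fail even though the cluster itself looks healthy.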
07-18-2018 08:29 AM
Not able to read or write to/from HDFS in CM - getting error: block size is 0 and no data is present. When checking the CM UI, all HDFS nodes have green status and no issues are reported.