Support Questions

Regarding"http://hortonworks.com/hadoop-tutorial/how-to-refine-and-visualize-server-log-data/"

Contributor

Right now the HDFS services are failing. Please see the logs:

0.1024069 secs] 91101K->9945K(245760K), 0.1024744 secs] [Times: user=0.32 sys=0.00, real=0.10 secs] 
Heap
 par new generation   total 92160K, used 35183K [0x00000000f0600000, 0x00000000f6a00000, 0x00000000f6a00000)
  eden space 81920K,  34% used [0x00000000f0600000, 0x00000000f218e640, 0x00000000f5600000)
  from space 10240K,  68% used [0x00000000f5600000, 0x00000000f5ccd8e8, 0x00000000f6000000)
  to   space 10240K,   0% used [0x00000000f6000000, 0x00000000f6000000, 0x00000000f6a00000)
 concurrent mark-sweep generation total 153600K, used 2978K [0x00000000f6a00000, 0x0000000100000000, 0x0000000100000000)
 Metaspace       used 21376K, capacity 21678K, committed 21884K, reserved 1069056K
  class space    used 2556K, capacity 2654K, committed 2688K, reserved 1048576K
==> /var/log/hadoop/hdfs/gc.log-201611240749 <==
Java HotSpot(TM) 64-Bit Server VM (25.77-b03) for linux-amd64 JRE (1.8.0_77-b03), built on Mar 20 2016 22:00:46 by "java_re" with gcc 4.3.0 20080428 (Red Hat 4.3.0-8)
Memory: 4k page, physical 9064548k(2520228k free), swap 5119996k(5119996k free)
CommandLine flags: -XX:CMSInitiatingOccupancyFraction=70 -XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log -XX:InitialHeapSize=262144000 -XX:MaxHeapSize=262144000 -XX:MaxNewSize=104857600 -XX:MaxTenuringThreshold=6 -XX:NewSize=52428800 -XX:OldPLABSize=16 -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:ParallelGCThreads=8 -XX:+PrintGC -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+UseCMSInitiatingOccupancyOnly -XX:+UseCompressedClassPointers -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC -XX:+UseParNewGC 
2016-11-24T07:49:27.202+0000: 1.402: [GC (Allocation Failure) 2016-11-24T07:49:27.203+0000: 1.402: [ParNew: 81920K->9191K(92160K), 0.0138715 secs] 81920K->9191K(245760K), 0.0139731 secs] [Times: user=0.03 sys=0.01, real=0.02 secs] 
2016-11-24T07:49:27.694+0000: 1.893: [GC (Allocation Failure) 2016-11-24T07:49:27.694+0000: 1.893: [ParNew: 91111K->7299K(92160K), 0.0262494 secs] 91111K->10275K(245760K), 0.0263202 secs] [Times: user=0.05 sys=0.00, real=0.02 secs] 
Heap
 par new generation   total 92160K, used 35518K [0x00000000f0600000, 0x00000000f6a00000, 0x00000000f6a00000)
  eden space 81920K,  34% used [0x00000000f0600000, 0x00000000f218ecd8, 0x00000000f5600000)
  from space 10240K,  71% used [0x00000000f5600000, 0x00000000f5d20d30, 0x00000000f6000000)
  to   space 10240K,   0% used [0x00000000f6000000, 0x00000000f6000000, 0x00000000f6a00000)
 concurrent mark-sweep generation total 153600K, used 2976K [0x00000000f6a00000, 0x0000000100000000, 0x0000000100000000)
 Metaspace       used 21335K, capacity 21622K, committed 21884K, reserved 1069056K
  class space    used 2555K, capacity 2654K, committed 2688K, reserved 1048576K
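
The GC log above only shows routine young-generation collections, and the heap summary at the end is typically printed when the JVM exits, so it does not by itself say why the services failed. A quick way to check whether the HDFS daemons are actually running and to look at the NameNode's own log rather than just its GC log might be the following (paths assume a default HDP sandbox install):

# ps -ef | grep -iE 'namenode|datanode' | grep -v grep            # are the HDFS daemons up at all?
# tail -n 50 /var/log/hadoop/hdfs/hadoop-hdfs-namenode-*.log      # startup/shutdown errors usually land here, not in gc.log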

2 REPLIES

Master Mentor

@Bibhas Burman

You seem to have run out of disk space! Can you run the commands below and check the output, assuming you are root?

# su - hdfs
$ hdfs dfsadmin -report

or

hdfs dfs -du -h /
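
If a full disk is the suspicion, it is also worth checking space at the OS level, since the NameNode can fail to start when its log or data directories cannot be written. A minimal sketch (the /var/log/hadoop and /hadoop paths are assumptions based on a typical HDP sandbox layout):

# df -h                                    # free space on each local filesystem
# du -sh /var/log/hadoop /hadoop           # common space consumers on the sandbox
# su - hdfs -c "hdfs dfsadmin -report"     # same report as above, run as the hdfs user in one step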

Contributor
1. Running $ hdfs dfsadmin -report, I am getting:

16/11/24 08:12:07 WARN retry.RetryInvocationHandler: Exception while invoking ClientNamenodeProtocolTranslatorPB.getStats over null. Not retrying because try once and fail.
java.net.ConnectException: Call From sandbox.hortonworks.com/172.17.0.2 to sandbox.hortonworks.com:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:801)
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1556)
	at org.apache.hadoop.ipc.Client.call(Client.java:1496)
	at org.apache.hadoop.ipc.Client.call(Client.java:1396)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
	at com.sun.proxy.$Proxy10.getFsStats(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getStats(ClientNamenodeProtocolTranslatorPB.java:657)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:278)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:194)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:176)
	at com.sun.proxy.$Proxy11.getStats(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.callGetStats(DFSClient.java:2535)
	at org.apache.hadoop.hdfs.DFSClient.getDiskStatus(DFSClient.java:2545)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getStatus(DistributedFileSystem.java:1231)
	at org.apache.hadoop.fs.FileSystem.getStatus(FileSystem.java:2335)
	at org.apache.hadoop.hdfs.tools.DFSAdmin.report(DFSAdmin.java:457)
	at org.apache.hadoop.hdfs.tools.DFSAdmin.run(DFSAdmin.java:1914)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
	at org.apache.hadoop.hdfs.tools.DFSAdmin.main(DFSAdmin.java:2107)
Caused by: java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:650)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:745)
	at org.apache.hadoop.ipc.Client$Connection.access$3200(Client.java:397)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1618)
	at org.apache.hadoop.ipc.Client.call(Client.java:1449)
	... 21 more
report: Call From sandbox.hortonworks.com/172.17.0.2 to sandbox.hortonworks.com:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
[hdfs@sandbox ~]$
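
The Connection refused on sandbox.hortonworks.com:8020 means nothing is listening on the NameNode RPC port, i.e. the NameNode process itself is down, so both dfsadmin -report and hdfs dfs -du will keep failing until it is back up. A quick way to confirm that and to find the actual startup error might be (assuming netstat is available on the sandbox; the log directory is the same one the GC logs above came from):

# netstat -tlnp | grep ':8020'             # no output means the NameNode is not listening
# ls -lt /var/log/hadoop/hdfs/ | head      # the newest NameNode .log/.out files should show why it exited

If the NameNode has died, restarting HDFS from Ambari (or fixing whatever error that log shows, such as a full disk) would be the next step.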