
HDP services not working

HDP services not working

Hi all,

I have been trying to access my Hadoop cluster, and I ran the following commands to find out why I couldn't connect to it at https://<ip address>:8080.

Commands:

hdfs dfsadmin -report

hadoop fs -ls

ps -ef|grep -i NameNode

As a result, I get the following errors:

17/02/09 17:40:50 INFO retry.RetryInvocationHandler: Exception while invoking getStats of class ClientNamenodeProtocolTranslatorPB over uzernshcm1.centrilogic.com/10.239.1.8:8020 after 4 fail over attempts. Trying to fail over after sleeping for 8039ms. java.net.ConnectException: Call From uzernshcm1.centrilogic.com/10.239.1.8 to uzernshcm1.centrilogic.com:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:801) at org.apache.hadoop.net.NetU

Could you shed some light on why I am experiencing this problem, and what my options are to remediate it and regain access to my cluster?

8 Replies

Re: HDP services not working

Mentor

Are you trying to connect on port 8080 or 8020? Just checking to make sure.


Re: HDP services not working

Hey, I used port 8080.


Re: HDP services not working

Can you please share your core-site.xml and hdfs-site.xml files?

Also, can you attach the output of each of the above commands?


Re: HDP services not working

Hi Namit,

How do I get the core-site.xml and hdfs-site.xml files?


Re: HDP services not working

They are usually under /etc/hadoop/conf.
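If you want to pull the relevant setting out of those files programmatically, here is a minimal sketch of parsing a Hadoop site XML file for a property such as fs.defaultFS (the NameNode address clients connect to). The sample XML below uses an illustrative hostname; substitute the contents of your own /etc/hadoop/conf/core-site.xml.

```python
# Sketch: extract a property value from a Hadoop *-site.xml configuration file.
# Hadoop configs are simple <configuration><property><name>/<value> XML.
import xml.etree.ElementTree as ET

def get_hadoop_property(conf_xml: str, name: str):
    """Return the <value> of the <property> whose <name> matches, or None."""
    root = ET.fromstring(conf_xml)
    for prop in root.findall("property"):
        if prop.findtext("name") == name:
            return prop.findtext("value")
    return None

# Illustrative core-site.xml fragment (hostname is a placeholder, not from
# your cluster -- read the real file from /etc/hadoop/conf instead):
sample = """<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://namenode.example.com:8020</value>
  </property>
</configuration>"""

print(get_hadoop_property(sample, "fs.defaultFS"))
```

Against a real cluster you would read the file with `open("/etc/hadoop/conf/core-site.xml").read()` and pass that string in; the value tells you which host and port the HDFS client commands are trying to reach.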


Re: HDP services not working

All I got was the following message:

# vi /etc/hadoop/conf location 2 files to edit

Also, here are the commands and their outputs that you wanted to see:

# hdfs dfsadmin -report

17/02/09 22:09:05 INFO retry.RetryInvocationHandler: Exception while invoking getStats of class ClientNamenodeProtocolTranslatorPB over uzernshcm1.centrilogic.com/10.239.1.8:8020 after 2 fail over attempts. Trying to fail over after sleeping for 2381ms. java.net.ConnectException: Call From uzernshcm1.centrilogic.com/10.239.1.8 to uzernshcm1.centrilogic.com:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:801) at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732) at org.apache.hadoop.ipc.Client.call(Client.java:1430) at org.apache.hadoop.ipc.Client.call(Client.java:1363) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.getFsStats(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getStats(ClientNamenodeProtocolTranslatorPB.java:614) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:256) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104) at com.sun.proxy.$Proxy11.getStats(Unknown Source) at org.apache.hadoop.hdfs.DFSClient.callGetStats(DFSClient.java:2538) at 
org.apache.hadoop.hdfs.DFSClient.getDiskStatus(DFSClient.java:2548) at org.apache.hadoop.hdfs.DistributedFileSystem.getStatus(DistributedFileSystem.java:1172) at org.apache.hadoop.fs.FileSystem.getStatus(FileSystem.java:2314) at org.apache.hadoop.hdfs.tools.DFSAdmin.report(DFSAdmin.java:454) at org.apache.hadoop.hdfs.tools.DFSAdmin.run(DFSAdmin.java:1797) at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76) at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90) at org.apache.hadoop.hdfs.tools.DFSAdmin.main(DFSAdmin.java:1973) Caused by: java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531) at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495) at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:617) at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:715) at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:378) at org.apache.hadoop.ipc.Client.getConnection(Client.java:1492) at org.apache.hadoop.ipc.Client.call(Client.java:1402) ... 20 more


Re: HDP services not working

# hadoop fs -ls

17/02/09 22:11:44 INFO retry.RetryInvocationHandler: Exception while invoking getFileInfo of class ClientNamenodeProtocolTranslatorPB over uzernshcm2.centrilogic.com/10.239.1.9:8020 after 1 fail over attempts. Trying to fail over after sleeping for 676ms. java.net.ConnectException: Call From uzernshcm1.centrilogic.com/10.239.1.8 to uzernshcm2.centrilogic.com:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:801) at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732) at org.apache.hadoop.ipc.Client.call(Client.java:1430) at org.apache.hadoop.ipc.Client.call(Client.java:1363) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:773) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:256) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104) at com.sun.proxy.$Proxy11.getFileInfo(Unknown Source) at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2162) at 
org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1363) at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1359) at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1359) at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57) at org.apache.hadoop.fs.Globber.glob(Globber.java:252) at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1655) at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:326) at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:235) at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:218) at org.apache.hadoop.fs.shell.Command.processRawArguments(Command.java:201) at org.apache.hadoop.fs.shell.Command.run(Command.java:165) at org.apache.hadoop.fs.FsShell.run(FsShell.java:287) at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76) at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90) at org.apache.hadoop.fs.FsShell.main(FsShell.java:340) Caused by: java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531) at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495) at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:617) at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:715) at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:378) at org.apache.hadoop.ipc.Client.getConnection(Client.java:1492) at org.apache.hadoop.ipc.Client.call(Client.java:1402) ... 28 more


Re: HDP services not working

In an HDP cluster, port 8080 is typically used by Ambari; after installing Ambari it serves HTTP by default, though HTTPS can be configured later as well. On the other hand, the error you are getting when running HDFS commands refers to port 8020, which is the default port used by the NameNode. Ambari and HDFS are, generally speaking, unrelated, in the sense that one can be down while the other is up and running.

The "Connection refused" error you get when you run HDFS commands can happen for a number of reasons, and they are well documented at the URL in the error message: http://wiki.apache.org/hadoop/ConnectionRefused. You can start by checking whether your NameNode is up and running, whether it is listening on port 8020, whether that port is open to the DataNodes and clients, and so on.
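The reachability check above can be sketched as a small TCP probe: if the NameNode process is down or not listening, a connect attempt fails the same way the HDFS client does with "Connection refused". The hostname and port in the usage comment are placeholders; substitute your NameNode host and 8020 (or 8080 for Ambari).

```python
# Sketch: test whether a host is accepting TCP connections on a port --
# the same check that fails with "Connection refused" in the stack traces.
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Usage against a real cluster (placeholder hostname):
#   port_open("namenode.example.com", 8020)   # NameNode RPC
#   port_open("namenode.example.com", 8080)   # Ambari web UI
```

If this returns False from the NameNode host itself, the NameNode process is most likely not running; check it with `ps -ef | grep -i namenode` or the Ambari UI and look at the NameNode logs for the reason it failed to start.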
