Remote access to clusters through hadoop fs -ls

Expert Contributor

I have two clusters, A (non-secure) and B (secure).

I am trying to list the directories of cluster A from cluster B.

On the master node of cluster B, I ran:

# hadoop fs -ls hdfs://10.166.54.12:8020/tmp/

(where 10.166.54.12 is the IP address of cluster A's master node)

I got this error:

18/02/26 12:54:23 WARN ipc.Client: Failed to connect to server: 10.166.54.12/10.166.54.12:8020: try once and fail.
java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
        at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:650)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:745)
        at org.apache.hadoop.ipc.Client$Connection.access$3200(Client.java:397)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1618)
        at org.apache.hadoop.ipc.Client.call(Client.java:1449)
        at org.apache.hadoop.ipc.Client.call(Client.java:1396)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
        at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:816)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:278)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:194)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:176)
        at com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2158)
        at org.apache.hadoop.hdfs.DistributedFileSystem$25.doCall(DistributedFileSystem.java:1423)
        at org.apache.hadoop.hdfs.DistributedFileSystem$25.doCall(DistributedFileSystem.java:1419)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1419)
        at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
        at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
        at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1674)
        at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:326)
        at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:235)
        at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:218)
        at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:103)
        at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
        at org.apache.hadoop.fs.FsShell.run(FsShell.java:297)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
        at org.apache.hadoop.fs.FsShell.main(FsShell.java:350)
18/02/26 12:54:23 WARN retry.RetryInvocationHandler: Exception while invoking ClientNamenodeProtocolTranslatorPB.getFileInfo over null. Not retrying because try once and fail.
java.net.ConnectException: Call From slzuyd5hmn03.yres.ytech/10.166.60.143 to 10.166.54.12:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Master Mentor

@Yassine

Can you please try two things? (I am assuming that you are running the HDFS command from a node of the secure cluster to connect to the non-secure NameNode; in that case no Kerberos ticket is needed. If it were the other way around, a valid Kerberos ticket would be required as well.)
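For the vice-versa case, a minimal sketch of obtaining and verifying a ticket (the principal "user@EXAMPLE.COM" below is a placeholder, not a value from this thread):

# kinit user@EXAMPLE.COM
# klist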

1. Please make sure that the remote cluster's IP address (and hostname) and port are reachable (this is to isolate a firewall blocking issue):

# telnet  10.166.54.12  8020
# telnet  $REMOTE_NN_HOSTNAME  8020



2. Please try specifying the "--config" parameter to point to a directory where you have kept the HDFS configuration files of the remote cluster:

# hadoop --config /PATH/TO/Remote_Clusters fs -ls /


Here, please replace "/PATH/TO/Remote_Clusters" with the path to the directory where you have kept the "hdfs-site.xml" and "core-site.xml" files of the remote cluster.
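For example, a minimal sketch of assembling such a directory (the local path /etc/hadoop/conf.clusterA and the standard /etc/hadoop/conf location on the remote host are assumptions for illustration):

# mkdir -p /etc/hadoop/conf.clusterA
# scp root@$REMOTE_NN_HOSTNAME:/etc/hadoop/conf/core-site.xml /etc/hadoop/conf.clusterA/
# scp root@$REMOTE_NN_HOSTNAME:/etc/hadoop/conf/hdfs-site.xml /etc/hadoop/conf.clusterA/
# hadoop --config /etc/hadoop/conf.clusterA fs -ls /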


Expert Contributor

@Jay Kumar SenSharma

I think the telnet client is not installed on my cluster. Could I try ssh instead?

Master Mentor

@Yassine

An alternative to telnet is "netcat" (nc), as follows:

# nc -v  10.166.54.12  8020
# nc -v  $REMOTE_NN_HOSTNAME  8020


Also, please check on the remote NameNode whether port 8020 is actually bound to the 0.0.0.0 address (or to 10.166.54.12) or to some other address. We need to run the following command on the host with IP address 10.166.54.12 to find out whether port 8020 is open and which IP address it is listening on:

# netstat -tnlpa | grep 8020
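If netstat shows the port bound to a single specific interface, the binding is controlled by the NameNode configuration. A minimal sketch of checking it (assuming the standard /etc/hadoop/conf location):

# grep -A1 'dfs.namenode.rpc-address' /etc/hadoop/conf/hdfs-site.xml
# grep -A1 'dfs.namenode.rpc-bind-host' /etc/hadoop/conf/hdfs-site.xml

Setting "dfs.namenode.rpc-bind-host" to 0.0.0.0 makes the NameNode RPC server listen on all interfaces.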


Expert Contributor

@Jay Kumar SenSharma

nc output from cluster B:

nc: connect to 10.166.54.12 port 8020 (tcp) failed: Connection refused

netstat output on the remote NameNode (10.166.54.12):

tcp        0      0 10.166.54.12:8020            0.0.0.0:*                   LISTEN      19578/java
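Since the port is listening on the target address but the connection is still refused from the remote host, a firewall in between is the likely cause, as suggested above. A minimal sketch of checking it on the NameNode host (the exact commands depend on the OS and the firewall in use):

# iptables -L -n | grep 8020
# firewall-cmd --state && firewall-cmd --list-all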