[dlabuser1@edge-r2 ~]$ HADOOP_ROOT_LOGGER=DEBUG,console hdfs dfs -ls hdfs://juggernaut/
18/10/26 16:32:12 DEBUG util.Shell: setsid exited with exit code 0
18/10/26 16:32:12 DEBUG conf.Configuration: parsing URL jar:file:/usr/hdp/2.5.6.0-40/hadoop/hadoop-common-2.7.3.2.5.6.0-40.jar!/core-default.xml
18/10/26 16:32:12 DEBUG conf.Configuration: parsing input stream sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@2641e737
18/10/26 16:32:13 DEBUG conf.Configuration: parsing URL file:/etc/hadoop/2.5.6.0-40/0/core-site.xml
18/10/26 16:32:13 DEBUG conf.Configuration: parsing input stream java.io.BufferedInputStream@221af3c0
18/10/26 16:32:13 DEBUG security.SecurityUtil: Setting hadoop.security.token.service.use_ip to true
18/10/26 16:32:13 DEBUG util.KerberosName: Kerberos krb5 configuration not found, setting default realm to empty
18/10/26 16:32:13 DEBUG security.Groups: Creating new Groups object
18/10/26 16:32:13 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
18/10/26 16:32:13 DEBUG util.NativeCodeLoader: Loaded the native-hadoop library
18/10/26 16:32:13 DEBUG security.JniBasedUnixGroupsMapping: Using JniBasedUnixGroupsMapping for Group resolution
18/10/26 16:32:13 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMapping
18/10/26 16:32:13 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
18/10/26 16:32:13 DEBUG security.UserGroupInformation: hadoop login
18/10/26 16:32:13 DEBUG security.UserGroupInformation: hadoop login commit
18/10/26 16:32:13 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: dlabuser1
18/10/26 16:32:13 DEBUG security.UserGroupInformation: Using user: "UnixPrincipal: dlabuser1" with name dlabuser1
18/10/26 16:32:13 DEBUG security.UserGroupInformation: User entry: "dlabuser1"
18/10/26 16:32:13 DEBUG security.UserGroupInformation: Assuming keytab is managed externally since logged in from subject.
18/10/26 16:32:13 DEBUG security.UserGroupInformation: UGI loginUser:dlabuser1 (auth:SIMPLE)
18/10/26 16:32:13 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
18/10/26 16:32:13 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = true
18/10/26 16:32:13 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
18/10/26 16:32:13 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path = /var/lib/hadoop-hdfs/dn_socket
18/10/26 16:32:13 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
18/10/26 16:32:13 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@389b0789
18/10/26 16:32:13 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@30b6ffe0
18/10/26 16:32:13 DEBUG unix.DomainSocketWatcher: org.apache.hadoop.net.unix.DomainSocketWatcher$2@6331e426: starting with interruptCheckPeriodMs = 60000
18/10/26 16:32:13 DEBUG shortcircuit.DomainSocketFactory: The short-circuit local reads feature is enabled.
18/10/26 16:32:13 DEBUG sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
18/10/26 16:32:13 DEBUG ipc.Client: The ping interval is 60000 ms.
18/10/26 16:32:13 DEBUG ipc.Client: Connecting to juggernaut/10.201.101.221:8020
18/10/26 16:32:13 DEBUG ipc.Client: IPC Client (1774720883) connection to juggernaut/10.201.101.221:8020 from dlabuser1: starting, having connections 1
18/10/26 16:32:13 DEBUG ipc.Client: IPC Client (1774720883) connection to juggernaut/10.201.101.221:8020 from dlabuser1 sending #0
18/10/26 16:32:13 DEBUG ipc.Client: IPC Client (1774720883) connection to juggernaut/10.201.101.221:8020 from dlabuser1 got value #0
18/10/26 16:32:13 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 43ms
18/10/26 16:32:13 DEBUG azure.NativeAzureFileSystem: finalize() called.
18/10/26 16:32:13 DEBUG azure.NativeAzureFileSystem: finalize() called.
18/10/26 16:32:13 DEBUG ipc.Client: IPC Client (1774720883) connection to juggernaut/10.201.101.221:8020 from dlabuser1 sending #1
18/10/26 16:48:14 DEBUG ipc.Client: closing ipc connection to juggernaut/10.201.101.221:8020: Connection reset by peer
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
	at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
	at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
	at sun.nio.ch.IOUtil.read(IOUtil.java:197)
	at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
	at org.apache.hadoop.net.SocketInputStream$Reader.performIO(SocketInputStream.java:57)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
	at java.io.FilterInputStream.read(FilterInputStream.java:133)
	at java.io.FilterInputStream.read(FilterInputStream.java:133)
	at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:554)
	at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
	at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
	at java.io.DataInputStream.readInt(DataInputStream.java:387)
	at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1119)
	at org.apache.hadoop.ipc.Client$Connection.run(Client.java:1014)
18/10/26 16:48:14 DEBUG ipc.Client: IPC Client (1774720883) connection to juggernaut/10.201.101.221:8020 from dlabuser1: closed
18/10/26 16:48:14 DEBUG ipc.Client: IPC Client (1774720883) connection to juggernaut/10.201.101.221:8020 from dlabuser1: stopped, remaining connections 0
18/10/26 16:48:14 DEBUG retry.RetryInvocationHandler: Exception while invoking ClientNamenodeProtocolTranslatorPB.getListing over null. Not retrying because try once and fail.
java.io.IOException: Failed on local exception: java.io.IOException: Connection reset by peer; Host Details : local host is: "edge-r2.datalab.smart.local.ph/10.126.3.41"; destination host is: "juggernaut":8020;
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:782)
	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1558)
	at org.apache.hadoop.ipc.Client.call(Client.java:1498)
	at org.apache.hadoop.ipc.Client.call(Client.java:1398)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
	at com.sun.proxy.$Proxy10.getListing(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:618)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:291)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:203)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:185)
	at com.sun.proxy.$Proxy11.getListing(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2136)
	at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2119)
	at org.apache.hadoop.hdfs.DistributedFileSystem.listStatusInternal(DistributedFileSystem.java:900)
	at org.apache.hadoop.hdfs.DistributedFileSystem.access$600(DistributedFileSystem.java:113)
	at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:966)
	at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:962)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.listStatus(DistributedFileSystem.java:962)
	at org.apache.hadoop.fs.shell.PathData.getDirectoryContents(PathData.java:268)
	at org.apache.hadoop.fs.shell.Command.recursePath(Command.java:373)
	at org.apache.hadoop.fs.shell.Ls.processPathArgument(Ls.java:220)
	at org.apache.hadoop.fs.shell.Command.processArgument(Command.java:271)
	at org.apache.hadoop.fs.shell.Command.processArguments(Command.java:255)
	at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:119)
	at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
	at org.apache.hadoop.fs.FsShell.run(FsShell.java:297)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
	at org.apache.hadoop.fs.FsShell.main(FsShell.java:356)
Caused by: java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
	at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
	at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
	at sun.nio.ch.IOUtil.read(IOUtil.java:197)
	at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
	at org.apache.hadoop.net.SocketInputStream$Reader.performIO(SocketInputStream.java:57)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
	at java.io.FilterInputStream.read(FilterInputStream.java:133)
	at java.io.FilterInputStream.read(FilterInputStream.java:133)
	at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:554)
	at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
	at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
	at java.io.DataInputStream.readInt(DataInputStream.java:387)
	at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1119)
	at org.apache.hadoop.ipc.Client$Connection.run(Client.java:1014)
ls: Failed on local exception: java.io.IOException: Connection reset by peer; Host Details : local host is: "edge-r2.datalab.smart.local.ph/10.126.3.41"; destination host is: "juggernaut":8020;
18/10/26 16:48:14 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@30b6ffe0
18/10/26 16:48:14 DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@30b6ffe0
18/10/26 16:48:14 DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@30b6ffe0
18/10/26 16:48:14 DEBUG ipc.Client: Stopping client
18/10/26 16:48:14 DEBUG util.ShutdownHookManager: ShutdownHookManger complete shutdown.
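Worth noting from the timestamps: the client sends RPC #1 at 16:32:13 and then sits silent until the peer resets the connection at 16:48:14. That gap can be confirmed directly from the log's timestamps (Hadoop's default `yy/MM/dd HH:mm:ss` format); a minimal sketch:

```python
from datetime import datetime

# Timestamps copied verbatim from the log above.
FMT = "%y/%m/%d %H:%M:%S"
sent = datetime.strptime("18/10/26 16:32:13", FMT)   # "sending #1"
reset = datetime.strptime("18/10/26 16:48:14", FMT)  # "Connection reset by peer"

print(reset - sent)  # → 0:16:01
```

So the connection was idle for just over 16 minutes before being reset, which is consistent with some intermediate device (firewall or load balancer, an assumption here, not something the log states) silently dropping the long-lived TCP session while the client waited for the `getListing` response.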