<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Hortonworks sandbox docker is running but ambari and ssh is not working in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hortonworks-sandbox-docker-is-running-but-ambari-and-ssh-is/m-p/239201#M85602</link>
    <description>&lt;P&gt;
	docker ps gives the following output. Both sandbox-hdp and sandbox-proxy are running; however, Ambari is not accessible, nor am I able to run ssh root@localhost -p 2222. It just hangs.&lt;/P&gt;&lt;PRE&gt;amedhi:HDP_3.0.1_docker-deploy-scripts_18120587fc7fb amedhi$ docker ps
CONTAINER ID        IMAGE                           COMMAND                  CREATED              STATUS              PORTS             NAMES
fac36a5d52c4        hortonworks/sandbox-proxy:1.0   "nginx -g 'daemon of…"   About a minute ago   Up 59 seconds       0.0.0.0:1080-&amp;gt;1080/tcp, 0.0.0.0:1100-&amp;gt;1100/tcp, 0.0.0.0:1111-&amp;gt;1111/tcp, 0.0.0.0:1988-&amp;gt;1988/tcp, 0.0.0.0:2100-&amp;gt;2100/tcp, 0.0.0.0:2181-2182-&amp;gt;2181-2182/tcp, 0.0.0.0:2201-2202-&amp;gt;2201-2202/tcp, 0.0.0.0:2222-&amp;gt;2222/tcp, 0.0.0.0:3000-&amp;gt;3000/tcp, 0.0.0.0:4040-&amp;gt;4040/tcp, 0.0.0.0:4200-&amp;gt;4200/tcp, 0.0.0.0:4242-&amp;gt;4242/tcp, 0.0.0.0:4557-&amp;gt;4557/tcp, 0.0.0.0:5007-&amp;gt;5007/tcp, 0.0.0.0:5011-&amp;gt;5011/tcp, 0.0.0.0:6001-&amp;gt;6001/tcp, 0.0.0.0:6003-&amp;gt;6003/tcp, 0.0.0.0:6008-&amp;gt;6008/tcp, 0.0.0.0:6080-&amp;gt;6080/tcp, 0.0.0.0:618
8-&amp;gt;6188/tcp, 0.0.0.0:6627-&amp;gt;6627/tcp, 0.0.0.0:6667-6668-&amp;gt;6667-6668/tcp, 0.0.0.0:7777-&amp;gt;7777/tcp, 0.0.0.0:7788-&amp;gt;7788/tcp, 0.0.0.0:8000-&amp;gt;8000/tcp, 0.0.0.0:8005-&amp;gt;8005/tcp, 0.0.0.0:8020-&amp;gt;8020/tcp, 0.0.0.0:8032-&amp;gt;8032/tcp, 0.0.0.0:8040-&amp;gt;8040/tcp, 0.0.0.0:8042-&amp;gt;8042/tcp, 0.0.0.0:8080-8082-&amp;gt;8080-8082/tcp, 0.0.0.0:8086-&amp;gt;8086/tcp, 0.0.0.0:8088-&amp;gt;8088/tcp, 0.0.0.0:8090-8091-&amp;gt;8090-8091/tcp, 0.0.0.0:8188-&amp;gt;8188/tcp, 0.0.0.0:8198-&amp;gt;8198/tcp, 0.0.0.0:8443-&amp;gt;8443/tcp, 0.0.0.0:8585-&amp;gt;8585/tcp, 0.0.0.0:8744-&amp;gt;8744/tcp, 0.0.0.0:8765-&amp;gt;8765/tcp, 0.0.0.0:8886-&amp;gt;8886/tcp, 0.0.0.0:8888-8889-&amp;gt;8888-8889/tcp, 0.0.0.0:8983-&amp;gt;8983/tcp, 0.0.0.0:8993-&amp;gt;8993/tcp, 0.0.0.0:9000-&amp;gt;9000/tcp, 0.0.0.0:9088-9091-&amp;gt;9088-9091/tcp, 0.0.0.0:9995-9996-&amp;gt;9995-9996/tcp, 0.0.0.0:10000-10002-&amp;gt;10000-10002/tcp, 0.0.0.0:10015-10016-&amp;gt;10015-10016/tcp, 0.0.0.0:10500-&amp;gt;10500/tcp, 0.0.0.0:10502-&amp;gt;10502/tcp, 0.0.0.0:11000-&amp;gt;11000/tcp, 0.0.0.0:12049-&amp;gt;12049/tcp, 0.0.0.0:12200-&amp;gt;12200/tcp, 0.0.0.0:15000-&amp;gt;15000/tcp, 0.0.0.0:15002-&amp;gt;15002/tcp, 0.0.0.0:15500-&amp;gt;15500/tcp, 0.0.0.0:16000-&amp;gt;16000/tcp, 0.0.0.0:16010-&amp;gt;16010/tcp, 0.0.0.0:16020-&amp;gt;16020/tcp, 0.0.0.0:16030-&amp;gt;16030/tcp, 0.0.0.0:18080-18081-&amp;gt;18080-18081/tcp, 0.0.0.0:19888-&amp;gt;19888/tcp, 0.0.0.0:21000-&amp;gt;21000/tcp, 0.0.0.0:30800-&amp;gt;30800/tcp, 0.0.0.0:33553-&amp;gt;33553/tcp, 0.0.0.0:39419-&amp;gt;39419/tcp, 0.0.0.0:42111-&amp;gt;42111/tcp, 0.0.0.0:50070-&amp;gt;50070/tcp, 0.0.0.0:50075-&amp;gt;50075/tcp, 0.0.0.0:50079-&amp;gt;50079/tcp, 0.0.0.0:50095-&amp;gt;50095/tcp, 0.0.0.0:50111-&amp;gt;50111/tcp, 0.0.0.0:60000-&amp;gt;60000/tcp, 0.0.0.0:60080-&amp;gt;60080/tcp, 0.0.0.0:61080-&amp;gt;61080/tcp, 80/tcp, 0.0.0.0:61888-&amp;gt;61888/tcp   sandbox-proxy
5d6bbab1d61c        hortonworks/sandbox-hdp:3.0.1   "/usr/sbin/init"         About a minute ago   Up About a minute   22/tcp, 4200/tcp, 8080/tcp       sandbox-hdp&lt;BR /&gt;&lt;/PRE&gt;</description>
    <pubDate>Fri, 16 Sep 2022 14:00:01 GMT</pubDate>
    <dc:creator>aasha_medhi2004</dc:creator>
    <dc:date>2022-09-16T14:00:01Z</dc:date>
    <item>
      <title>Hortonworks sandbox docker is running but ambari and ssh is not working</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hortonworks-sandbox-docker-is-running-but-ambari-and-ssh-is/m-p/239201#M85602</link>
      <description>&lt;P&gt;
	docker ps gives the following output. Both sandbox-hdp and sandbox-proxy are running; however, Ambari is not accessible, nor am I able to run ssh root@localhost -p 2222. It just hangs.&lt;/P&gt;&lt;PRE&gt;amedhi:HDP_3.0.1_docker-deploy-scripts_18120587fc7fb amedhi$ docker ps
CONTAINER ID        IMAGE                           COMMAND                  CREATED              STATUS              PORTS             NAMES
fac36a5d52c4        hortonworks/sandbox-proxy:1.0   "nginx -g 'daemon of…"   About a minute ago   Up 59 seconds       0.0.0.0:1080-&amp;gt;1080/tcp, 0.0.0.0:1100-&amp;gt;1100/tcp, 0.0.0.0:1111-&amp;gt;1111/tcp, 0.0.0.0:1988-&amp;gt;1988/tcp, 0.0.0.0:2100-&amp;gt;2100/tcp, 0.0.0.0:2181-2182-&amp;gt;2181-2182/tcp, 0.0.0.0:2201-2202-&amp;gt;2201-2202/tcp, 0.0.0.0:2222-&amp;gt;2222/tcp, 0.0.0.0:3000-&amp;gt;3000/tcp, 0.0.0.0:4040-&amp;gt;4040/tcp, 0.0.0.0:4200-&amp;gt;4200/tcp, 0.0.0.0:4242-&amp;gt;4242/tcp, 0.0.0.0:4557-&amp;gt;4557/tcp, 0.0.0.0:5007-&amp;gt;5007/tcp, 0.0.0.0:5011-&amp;gt;5011/tcp, 0.0.0.0:6001-&amp;gt;6001/tcp, 0.0.0.0:6003-&amp;gt;6003/tcp, 0.0.0.0:6008-&amp;gt;6008/tcp, 0.0.0.0:6080-&amp;gt;6080/tcp, 0.0.0.0:618
8-&amp;gt;6188/tcp, 0.0.0.0:6627-&amp;gt;6627/tcp, 0.0.0.0:6667-6668-&amp;gt;6667-6668/tcp, 0.0.0.0:7777-&amp;gt;7777/tcp, 0.0.0.0:7788-&amp;gt;7788/tcp, 0.0.0.0:8000-&amp;gt;8000/tcp, 0.0.0.0:8005-&amp;gt;8005/tcp, 0.0.0.0:8020-&amp;gt;8020/tcp, 0.0.0.0:8032-&amp;gt;8032/tcp, 0.0.0.0:8040-&amp;gt;8040/tcp, 0.0.0.0:8042-&amp;gt;8042/tcp, 0.0.0.0:8080-8082-&amp;gt;8080-8082/tcp, 0.0.0.0:8086-&amp;gt;8086/tcp, 0.0.0.0:8088-&amp;gt;8088/tcp, 0.0.0.0:8090-8091-&amp;gt;8090-8091/tcp, 0.0.0.0:8188-&amp;gt;8188/tcp, 0.0.0.0:8198-&amp;gt;8198/tcp, 0.0.0.0:8443-&amp;gt;8443/tcp, 0.0.0.0:8585-&amp;gt;8585/tcp, 0.0.0.0:8744-&amp;gt;8744/tcp, 0.0.0.0:8765-&amp;gt;8765/tcp, 0.0.0.0:8886-&amp;gt;8886/tcp, 0.0.0.0:8888-8889-&amp;gt;8888-8889/tcp, 0.0.0.0:8983-&amp;gt;8983/tcp, 0.0.0.0:8993-&amp;gt;8993/tcp, 0.0.0.0:9000-&amp;gt;9000/tcp, 0.0.0.0:9088-9091-&amp;gt;9088-9091/tcp, 0.0.0.0:9995-9996-&amp;gt;9995-9996/tcp, 0.0.0.0:10000-10002-&amp;gt;10000-10002/tcp, 0.0.0.0:10015-10016-&amp;gt;10015-10016/tcp, 0.0.0.0:10500-&amp;gt;10500/tcp, 0.0.0.0:10502-&amp;gt;10502/tcp, 0.0.0.0:11000-&amp;gt;11000/tcp, 0.0.0.0:12049-&amp;gt;12049/tcp, 0.0.0.0:12200-&amp;gt;12200/tcp, 0.0.0.0:15000-&amp;gt;15000/tcp, 0.0.0.0:15002-&amp;gt;15002/tcp, 0.0.0.0:15500-&amp;gt;15500/tcp, 0.0.0.0:16000-&amp;gt;16000/tcp, 0.0.0.0:16010-&amp;gt;16010/tcp, 0.0.0.0:16020-&amp;gt;16020/tcp, 0.0.0.0:16030-&amp;gt;16030/tcp, 0.0.0.0:18080-18081-&amp;gt;18080-18081/tcp, 0.0.0.0:19888-&amp;gt;19888/tcp, 0.0.0.0:21000-&amp;gt;21000/tcp, 0.0.0.0:30800-&amp;gt;30800/tcp, 0.0.0.0:33553-&amp;gt;33553/tcp, 0.0.0.0:39419-&amp;gt;39419/tcp, 0.0.0.0:42111-&amp;gt;42111/tcp, 0.0.0.0:50070-&amp;gt;50070/tcp, 0.0.0.0:50075-&amp;gt;50075/tcp, 0.0.0.0:50079-&amp;gt;50079/tcp, 0.0.0.0:50095-&amp;gt;50095/tcp, 0.0.0.0:50111-&amp;gt;50111/tcp, 0.0.0.0:60000-&amp;gt;60000/tcp, 0.0.0.0:60080-&amp;gt;60080/tcp, 0.0.0.0:61080-&amp;gt;61080/tcp, 80/tcp, 0.0.0.0:61888-&amp;gt;61888/tcp   sandbox-proxy
5d6bbab1d61c        hortonworks/sandbox-hdp:3.0.1   "/usr/sbin/init"         About a minute ago   Up About a minute   22/tcp, 4200/tcp, 8080/tcp       sandbox-hdp&lt;BR /&gt;&lt;/PRE&gt;</description>
      <pubDate>Fri, 16 Sep 2022 14:00:01 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hortonworks-sandbox-docker-is-running-but-ambari-and-ssh-is/m-p/239201#M85602</guid>
      <dc:creator>aasha_medhi2004</dc:creator>
      <dc:date>2022-09-16T14:00:01Z</dc:date>
    </item>
    <item>
      <title>Re: Hortonworks sandbox docker is running but ambari and ssh is not working</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hortonworks-sandbox-docker-is-running-but-ambari-and-ssh-is/m-p/239202#M85603</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;What error do you see while accessing Ambari?&lt;/P&gt;</description>
      <pubDate>Wed, 19 Dec 2018 20:23:16 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hortonworks-sandbox-docker-is-running-but-ambari-and-ssh-is/m-p/239202#M85603</guid>
      <dc:creator>sandy605</dc:creator>
      <dc:date>2018-12-19T20:23:16Z</dc:date>
    </item>
    <item>
      <title>Re: Hortonworks sandbox docker is running but ambari and ssh is not working</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hortonworks-sandbox-docker-is-running-but-ambari-and-ssh-is/m-p/239203#M85604</link>
      <description>&lt;P&gt;No error. It just hangs. The same happens with ssh.&lt;/P&gt;</description>
      <pubDate>Thu, 20 Dec 2018 12:27:50 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hortonworks-sandbox-docker-is-running-but-ambari-and-ssh-is/m-p/239203#M85604</guid>
      <dc:creator>aasha_medhi2004</dc:creator>
      <dc:date>2018-12-20T12:27:50Z</dc:date>
    </item>
    <item>
      <title>Re: Hortonworks sandbox docker is running but ambari and ssh is not working</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hortonworks-sandbox-docker-is-running-but-ambari-and-ssh-is/m-p/239204#M85605</link>
      <description>&lt;P&gt;The above issue turned out to be intermittent. However, now I am facing a different issue. I have a Docker sandbox setup of HDP, and I am trying to do a copyFromLocal (HDFS copy) from the host machine, but it gives this error.&lt;/P&gt;&lt;PRE&gt;amedhi:~ amedhi$ hadoop fs -copyFromLocal trial.txt hdfs://sandbox-hdp.hortonworks.com:8020/tmp/
2018-12-19 14:27:33,103 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-12-19 14:28:34,062 INFO hdfs.DataStreamer: Exception in createBlockOutputStream blk_1073743875_3061
org.apache.hadoop.net.ConnectTimeoutException: 60000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=/172.18.0.2:50010]
                at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:534)
                at org.apache.hadoop.hdfs.DataStreamer.createSocketForPipeline(DataStreamer.java:253)
                at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1725)
                at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1679)
                at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:716)
2018-12-19 14:28:34,063 WARN hdfs.DataStreamer: Abandoning BP-1419118625-172.17.0.2-1543512323726:blk_1073743875_3061
2018-12-19 14:28:34,078 WARN hdfs.DataStreamer: Excluding datanode DatanodeInfoWithStorage[172.18.0.2:50010,DS-6c34ba72-0587-4927-88a1-781ba7d444d9,DISK]
2018-12-19 14:28:34,105 WARN hdfs.DataStreamer: DataStreamer Exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tmp/trial.txt._COPYING_ could only be written to 0 of the 1 minReplication nodes. There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
                at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:2121)
                at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:286)
                at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2706)
                at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:875)
                at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:561)
                at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
                at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524)
                at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025)
                at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876)
                at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822)
                at java.security.AccessController.doPrivileged(Native Method)
                at javax.security.auth.Subject.doAs(Subject.java:422)
                at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
                at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682)
 
                at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1497)
                at org.apache.hadoop.ipc.Client.call(Client.java:1443)
                at org.apache.hadoop.ipc.Client.call(Client.java:1353)
                at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
                at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
                at com.sun.proxy.$Proxy11.addBlock(Unknown Source)
                at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:510)
                at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
                at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                at java.lang.reflect.Method.invoke(Method.java:498)
                at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
                at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
                at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
                at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
                at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
                at com.sun.proxy.$Proxy12.addBlock(Unknown Source)
                at org.apache.hadoop.hdfs.DFSOutputStream.addBlock(DFSOutputStream.java:1078)
                at org.apache.hadoop.hdfs.DataStreamer.locateFollowingBlock(DataStreamer.java:1865)
                at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1668)
                at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:716)
copyFromLocal: File /tmp/trial.txt._COPYING_ could only be written to 0 of the 1 minReplication nodes. There are 1 datanode(s) running and 1 node(s) are excluded in this operation.


&lt;/PRE&gt;&lt;P&gt;I made the following fixes to resolve the above error:&lt;/P&gt;&lt;P&gt;1. Opened TCP port 50010 on the Docker container&lt;/P&gt;&lt;P&gt;2. Set the property dfs.client.use.datanode.hostname to true in the Hadoop configuration on my host machine and also in the Ambari configuration&lt;/P&gt;&lt;P&gt;Now I am facing the following issue. Please note that the Hadoop version is the same on my host machine and in the Docker container.&lt;/P&gt;&lt;PRE&gt;amedhi:hadoop amedhi$ hadoop fs -copyFromLocal hdfs-site.xml hdfs://sandbox-hdp.hortonworks.com:8020/tmp
2018-12-20 09:55:04,203 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-12-20 09:55:05,045 INFO hdfs.DataStreamer: Exception in createBlockOutputStream blk_1073743815_2999
com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.
	at com.google.protobuf.InvalidProtocolBufferException.invalidEndTag(InvalidProtocolBufferException.java:94)
	at com.google.protobuf.CodedInputStream.checkLastTagWas(CodedInputStream.java:124)
	at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:202)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
	at org.apache.hadoop.hdfs.protocol.proto.DataTransferProtos$BlockOpResponseProto.parseFrom(DataTransferProtos.java:23592)
	at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1761)
	at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1679)
	at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:716)
2018-12-20 09:55:05,048 WARN hdfs.DataStreamer: Abandoning BP-1419118625-172.17.0.2-1543512323726:blk_1073743815_2999
2018-12-20 09:55:05,055 WARN hdfs.DataStreamer: Excluding datanode DatanodeInfoWithStorage[172.18.0.2:50010,DS-6c34ba72-0587-4927-88a1-781ba7d444d9,DISK]
2018-12-20 09:55:05,075 WARN hdfs.DataStreamer: DataStreamer Exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tmp/hdfs-site.xml._COPYING_ could only be written to 0 of the 1 minReplication nodes. There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:2121)
	at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:286)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2706)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:875)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:561)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682)


	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1497)
	at org.apache.hadoop.ipc.Client.call(Client.java:1443)
	at org.apache.hadoop.ipc.Client.call(Client.java:1353)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
	at com.sun.proxy.$Proxy11.addBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:510)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
	at com.sun.proxy.$Proxy12.addBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream.addBlock(DFSOutputStream.java:1078)
	at org.apache.hadoop.hdfs.DataStreamer.locateFollowingBlock(DataStreamer.java:1865)
	at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1668)
	at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:716)
copyFromLocal: File /tmp/hdfs-site.xml._COPYING_ could only be written to 0 of the 1 minReplication nodes. There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
&lt;/PRE&gt;</description>
      <pubDate>Thu, 20 Dec 2018 21:36:50 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hortonworks-sandbox-docker-is-running-but-ambari-and-ssh-is/m-p/239204#M85605</guid>
      <dc:creator>aasha_medhi2004</dc:creator>
      <dc:date>2018-12-20T21:36:50Z</dc:date>
    </item>
    <item>
      <title>Re: Hortonworks sandbox docker is running but ambari and ssh is not working</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hortonworks-sandbox-docker-is-running-but-ambari-and-ssh-is/m-p/239205#M85606</link>
      <description>&lt;P&gt;Stopping and removing the Docker containers and then rerunning the Docker run script worked for me.&lt;/P&gt;</description>
      <pubDate>Thu, 03 Jan 2019 13:46:37 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hortonworks-sandbox-docker-is-running-but-ambari-and-ssh-is/m-p/239205#M85606</guid>
      <dc:creator>aasha_medhi2004</dc:creator>
      <dc:date>2019-01-03T13:46:37Z</dc:date>
    </item>
  </channel>
</rss>

