I was successfully able to copy a file to Azure from Windows using PuTTY Secure Copy (pscp). I used the following command:
pscp -P 22 <file from local folder> tmp\
I could see that file on the sandbox host. But when I tried to "put" the file from the sandbox host into the sandbox itself (the Docker container), I was not able to do so; it throws a network-not-accessible error. The command I used is below.
Ambari is the web UI used to administer, monitor, and provision a Hadoop cluster. It also has the concept of Views, which allow browsing the Hadoop Distributed File System (HDFS) as well as querying data through Hive and writing Pig scripts, among other things (it is even extensible to do something custom). Regardless, within Ambari you can use the Files view (example link: http://<AZURE PUBLIC IP>:8080/#/main/views/FILES/1.0.0/AUTO_FILES_INSTANCE ).
You can log in as raj_ops (with password raj_ops) to get to the Files view. Don't just click the link; you'll have to change the above link to match your Azure sandbox's public IP address. This also assumes you have port 8080 open in Azure's Network Security Group settings.
Hi Dan, thanks for your help. Just want to give you an update on the process: when I ran the first line of commands it worked, but when I tried docker cp /home/drice/test2.txt sandbox:dan/test2.txt, it said no such file exists, even though I used my directory.
When you SSH to port 2222, you are inside the sandbox container. Is the file there? If you run ls -l tmp/, what do you get? That would be different from ls -l /tmp, given the pscp command you ran. You should be able to pscp directly into the container by going to port 2222:
pscp -P 2222 <file_from_local> root@<azure public IP>:/tmp
Then, in your shell you should be able to
[root@sandbox ~]# ls -l /tmp
-rw-r--r-- 1 root root 7540 Feb 27 10:00 file_from_local
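As for the docker cp error mentioned earlier: that "no such file" message can come from either side of the copy. The source path must exist on the host where you run the command, and the destination directory after sandbox: must already exist inside the container. A minimal sketch, assuming the container is named sandbox and you copy into /root (both the path /home/drice/test2.txt and the target directory are examples, adjust to your own):

```shell
# Run these on the Azure sandbox host (port 22), not inside the container.
ls -l /home/drice/test2.txt                # confirm the source file exists on the host
docker exec sandbox mkdir -p /root         # make sure the target directory exists in the container
docker cp /home/drice/test2.txt sandbox:/root/test2.txt
docker exec sandbox ls -l /root/test2.txt  # verify the file arrived inside the container
```

Note that the container-side path is resolved inside the container, so a relative destination like dan/test2.txt will fail unless that directory already exists there.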
Then I think you will want to copy it from the Linux filesystem into HDFS using the hadoop command.
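A sketch of that step, run inside the sandbox container and assuming the file landed in /tmp as shown above (the HDFS destination /user/root is just an example target directory):

```shell
# Copy the file from the container's local Linux filesystem into HDFS.
hadoop fs -mkdir -p /user/root                    # create the HDFS target directory if needed
hadoop fs -put /tmp/file_from_local /user/root/   # local -> HDFS
hadoop fs -ls /user/root                          # verify the file is now in HDFS
```

Once it is in HDFS, it should also show up in the Ambari Files view mentioned above.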