
Copy file from Windows to sandbox hosted in Azure


I was successfully able to copy a file from Windows to the Azure host using PuTTY Secure Copy (pscp). I used the following command:

pscp -P 22 <file from local folder> root@<azure host>:/tmp/

I could see that file on the sandbox host. But when I tried to "put" the file from the sandbox host into the sandbox itself (the Docker container), it failed with a "network not accessible" error. I tried the commands below:

hadoop fs -ls -put \tmp\onecsvfile.csv \tmp\

hadoop fs -ls -put \tmp\onecsvfile.csv root@localhost:2222\\tmp\

Nothing works 😞

Can you tell me how I can move the file from the sandbox host into the sandbox (Sandbox 2.5 hosted in Azure)?



Hi @Prasanna G!

You have the following options:

1. You can use "docker cp" if you have only one file. Here is the documentation with examples.
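A minimal sketch of option 1, assuming the file already sits in /tmp on the host; the container name is an assumption, so check yours with "docker ps" first:

```shell
# Find the sandbox container's name or ID
docker ps

# Copy a single file from the host into the container's /tmp
# ("sandbox" is an assumed container name -- substitute the one docker ps shows)
docker cp /tmp/onecsvfile.csv sandbox:/tmp/onecsvfile.csv
```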

2. The most flexible solution is to mount a local directory into the sandbox with the -v option of "docker run". This is preferred because everything in the mounted folder is accessible inside the container.

docker run -v /Users/<path>:/<container path> ...

You can find an example of how to configure this for the Sandbox here.
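A hedged sketch of option 2; the host path, container path, and image name here are all placeholders for illustration, so verify the actual sandbox image with "docker images" before running it:

```shell
# Bind-mount the host folder /data into the container at /data,
# so files dropped into /data on the host appear inside the container
# (image name "hortonworks/sandbox" is an assumption -- use your own)
docker run -v /data:/data -d --name sandbox hortonworks/sandbox
```

Note that the mount must be set at container creation time; an already-running container cannot gain a new -v mount without being recreated.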




@Prasanna G Have you started Sandbox from Azure Marketplace?

If so, you need to manually install Docker, as described here.
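If Docker turns out to be missing, a rough sketch of the manual install, assuming a yum-based CentOS 7 host (the HDP sandbox VMs are typically CentOS; adjust for your image):

```shell
# Install the Docker package from the distribution repositories
sudo yum install -y docker

# Enable and start the Docker daemon, then verify it is reachable
sudo systemctl enable --now docker
sudo docker ps
```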

You also have the option to install the Docker extension with the VM install; that should work as well.



Hello @pdarvasi, Docker is installed as part of the Azure Sandbox. I can see the Docker service running, and I was also able to run the "docker cp" command once I did "sudo su".


@Prasanna G,

When you ssh to port 2222, you are inside the sandbox container. Is the file there? If you run "ls -l tmp/", what do you get? That is a different location from "ls -l /tmp", given the relative path in the pscp command you ran. You should be able to pscp directly into the container by targeting port 2222:

pscp -P 2222 <file_from_local> root@<sandbox host>:/tmp

Then, in your shell you should be able to

[root@sandbox ~]# ls -l /tmp
total 548
-rw-r--r-- 1 root       root     7540 Feb 27 10:00 file_from_local

Then I think you will want to copy it from the Linux filesystem into HDFS using the hadoop command:

[root@sandbox ~]# hadoop fs -put /tmp/file_from_local /tmp
[root@sandbox ~]# hadoop fs -ls /tmp
-rw-r--r--   1 root      hdfs          0 2017-03-01 20:43 /tmp/file_from_local