
How to upload data from local machine to HDFS on Azure - using command shell (Hortonworks Sandbox)

Explorer

I am trying to upload data from my local machine to HDFS on Azure using the command line.

This is a simple task when using the Ambari GUI - you simply hit browse and then select the file you want:

[screenshot: Ambari file upload dialog]

However, this doesn't seem to be so simple from the command shell. I've tried various methods, and the only information I can find online involves Azure's HDInsight.

For clarity:

Goal - upload a CSV file from my local machine to HDFS on Azure using the command line (Hortonworks Sandbox). The basic hdfs dfs -put command does not work because it can't find the file (a sketch of the kind of command I've been trying is below).

The reason I'd like to do this on the command line is that the Hortonworks cert exam is entirely in the shell, so I'd like to understand the solution.
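
For illustration, this is roughly the kind of command I've been running from the sandbox's SSH shell (the paths are placeholders, not my actual ones):

# Fails because the source path refers to a file on my laptop,
# which doesn't exist on the sandbox's local filesystem
hdfs dfs -put /local/path/data.csv /user/me/data.csv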

Any assistance would be wonderful!

1 ACCEPTED SOLUTION

Super Guru

@Cameron Warren

You need to first scp your file to the Azure machine running the sandbox. Once that's done, you can use copyFromLocal to copy the file into HDFS.

hdfs dfs -copyFromLocal /path/to/file /dest/path/on/hdfs
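
A rough end-to-end sketch, assuming the sandbox is reachable over SSH as sandbox-host (the hostname, user, port, and paths below are placeholders for your own Azure VM / sandbox settings):

# 1. From the local machine: copy the CSV up to the Azure VM over SSH
scp -P 22 /local/path/data.csv azureuser@sandbox-host:/tmp/data.csv

# 2. Log in to the sandbox shell
ssh -p 22 azureuser@sandbox-host

# 3. On the sandbox: create a target directory in HDFS and copy the file in
hdfs dfs -mkdir -p /user/azureuser/data
hdfs dfs -copyFromLocal /tmp/data.csv /user/azureuser/data/

# 4. Verify the upload
hdfs dfs -ls /user/azureuser/data

Depending on how the sandbox is deployed, SSH may be exposed on a non-default port (the HDP sandbox commonly uses 2222), so adjust the -P/-p values accordingly.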

