How to upload/download a file from a web server (Windows) to Hortonworks Sandbox HDFS?
Labels: Apache Hadoop
Created 01-10-2016 01:56 PM
I want to build an application that lets users upload a video file, processes it with Spark, and sends the processed file back. So far I can upload files to a Windows web server (Tomcat). After a file is uploaded, I want to send it to HDFS, process it, and after processing send it back to HDFS and the web server.

1) Is it possible to do such a thing? I mean, is there a web service to send files from Windows to HDFS hosted on Linux? Is there a shorter path I can follow?

2) Assuming I can upload files to HDFS, how can I trigger the MapReduce job to run, process the video, and send it back to the web server?
Created 01-10-2016 02:02 PM
For the file upload, you can leverage WebHDFS: https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-hdfs/WebHDFS.html
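For illustration, here is a minimal Python sketch of the WebHDFS two-step upload (and a matching download). The host, port, user, and HDFS paths are assumptions for a default sandbox and will differ per setup:

```python
import requests

# Assumed values for a default Hortonworks sandbox -- adjust to your cluster.
NAMENODE = "http://sandbox.hortonworks.com:50070"   # WebHDFS NameNode endpoint
HDFS_PATH = "/user/guest/videos/input.mp4"          # hypothetical target path
USER = "guest"

# Upload (op=CREATE) is a two-step protocol: the NameNode replies with a
# 307 redirect naming a DataNode, and the file body is PUT to that URL.
url = f"{NAMENODE}/webhdfs/v1{HDFS_PATH}?op=CREATE&user.name={USER}&overwrite=true"
resp = requests.put(url, allow_redirects=False)
datanode_url = resp.headers["Location"]

with open("input.mp4", "rb") as f:
    requests.put(datanode_url, data=f).raise_for_status()  # 201 Created on success

# Download (op=OPEN) uses the same redirect pattern; requests follows it here.
url = f"{NAMENODE}/webhdfs/v1{HDFS_PATH}?op=OPEN&user.name={USER}"
with requests.get(url, stream=True) as resp, open("output.mp4", "wb") as out:
    for chunk in resp.iter_content(chunk_size=1 << 20):
        out.write(chunk)
```

Since WebHDFS is plain HTTP, the Tomcat application on Windows can make these calls directly, with no Hadoop client libraries installed on the web server.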
Oozie can help schedule the workflow that does the processing.
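As a hedged sketch of triggering such a workflow programmatically, the Oozie REST API accepts a job configuration as XML. The host, user, and application path below are placeholders, and the workflow.xml itself must already be deployed to HDFS:

```python
import requests

OOZIE = "http://sandbox.hortonworks.com:11000/oozie"  # default Oozie port; host is an assumption

# Minimal job configuration. oozie.wf.application.path must point at an HDFS
# directory containing a deployed workflow.xml (placeholder path below).
conf = """<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <property><name>user.name</name><value>guest</value></property>
  <property>
    <name>oozie.wf.application.path</name>
    <value>hdfs:///user/guest/workflows/video-processing</value>
  </property>
</configuration>"""

resp = requests.post(f"{OOZIE}/v1/jobs?action=start", data=conf,
                     headers={"Content-Type": "application/xml;charset=UTF-8"})
resp.raise_for_status()
print(resp.json())  # e.g. {"id": "0000001-160110162521000-oozie-oozi-W"}
```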
Created 01-10-2016 02:00 PM
@Mihai Mihai take a look at the WebHDFS and WebHCat APIs.
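For question 2, WebHCat (formerly Templeton) can submit a MapReduce job over plain HTTP, which fits the Windows web server scenario. A minimal sketch, assuming WebHCat on its default port 50111; the host, jar location, class name, and arguments are hypothetical placeholders:

```python
import requests

WEBHCAT = "http://sandbox.hortonworks.com:50111/templeton/v1"  # host is an assumption

# Submit a MapReduce jar job. 'arg' is repeatable, so a list of pairs is used.
resp = requests.post(f"{WEBHCAT}/mapreduce/jar", data=[
    ("user.name", "guest"),
    ("jar", "/user/guest/jars/video-processing.jar"),  # jar location in HDFS
    ("class", "com.example.VideoProcessingJob"),       # hypothetical driver class
    ("arg", "/user/guest/videos/input.mp4"),           # job input
    ("arg", "/user/guest/videos/processed"),           # job output
    ("statusdir", "/user/guest/status"),               # where WebHCat writes stdout/stderr
])
resp.raise_for_status()
print(resp.json())  # e.g. {"id": "job_201601101456_0001"}
```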
