Member since: 01-10-2016 · Posts: 13 · Kudos Received: 4 · Solutions: 0
01-11-2016
07:19 PM
The port must be different. Finally I can upload files. After the first call, take the URL from the "Location" header of the response and make a second call with the file (-T file), but instead of "sandbox.hortonworks.com" use the machine's IP. Here is an example:

****************************

curl -i -X PUT "http://192.168.1.113:50070/webhdfs/v1/tmp/testDIR/test.txt?op=CREATE&user.name=hdfs"

*****************************

HTTP/1.1 307 TEMPORARY_REDIRECT
Cache-Control: no-cache
Expires: Mon, 11 Jan 2016 19:02:58 GMT
Date: Mon, 11 Jan 2016 19:02:58 GMT
Pragma: no-cache
Expires: Mon, 11 Jan 2016 19:02:58 GMT
Date: Mon, 11 Jan 2016 19:02:58 GMT
Pragma: no-cache
Set-Cookie: hadoop.auth="u=hdfs&p=hdfs&t=simple&e=1452574978370&s=wWMnpbCQCrLuYrSO5ViO6Vt8rzw="; Path=/; Expires=Tue, 12-Jan-2016 05:02:58 GMT; HttpOnly
Location: http://sandbox.hortonworks.com:50075/webhdfs/v1/tmp/testDIR/test.txt?op=CREATE&user.name=hdfs&namenoderpcaddress=sandbox.hortonworks.com:8020&createflag=&createparent=true&overwrite=false
Content-Type: application/octet-stream
Content-Length: 0
Server: Jetty(6.1.26.hwx)

********************************************

C:\Users\...\Desktop>curl -i -X PUT -T test.txt "http://192.168.1.113:50075/webhdfs/v1/tmp/testDIR/test.txt?op=CREATE&user.name=hdfs&namenoderpcaddress=sandbox.hortonworks.com:8020&createflag=&createparent=true&overwrite=false"
HTTP/1.1 100 Continue
HTTP/1.1 201 Created
Location: hdfs://sandbox.hortonworks.com:8020/tmp/testDIR/test.txt
Content-Length: 0
Connection: close
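The two-step upload above can be sketched in Python using only the standard library. The IP, port, user, and file path come from the example in this post; rewriting the host in the "Location" header to a reachable IP is the key step. This is a sketch under those assumptions, not a production client:

```python
# Sketch of the two-step WebHDFS upload from this thread, using only the
# Python standard library. The namenode IP, port and file path are taken
# from the example above; adapt them to your cluster.
import http.client
from urllib.parse import urlsplit

def rewrite_host(location, ip):
    """Replace the hostname in a redirect URL with a reachable IP,
    keeping the datanode port and all query parameters intact."""
    parts = urlsplit(location)
    _, _, port = parts.netloc.partition(":")
    netloc = ip + (":" + port if port else "")
    return parts._replace(netloc=netloc).geturl()

def upload(ip, hdfs_path, local_file, user="hdfs"):
    # Step 1: PUT to the namenode. WebHDFS answers 307 with a datanode
    # "Location" that may use a hostname the client cannot resolve.
    conn = http.client.HTTPConnection(ip, 50070)
    conn.request("PUT", f"/webhdfs/v1{hdfs_path}?op=CREATE&user.name={user}")
    resp = conn.getresponse()
    location = rewrite_host(resp.getheader("Location"), ip)
    resp.read()
    conn.close()

    # Step 2: PUT the file body to the rewritten datanode URL.
    parts = urlsplit(location)
    host, _, port = parts.netloc.partition(":")
    conn = http.client.HTTPConnection(host, int(port))
    with open(local_file, "rb") as f:
        conn.request("PUT", f"{parts.path}?{parts.query}", body=f)
    return conn.getresponse().status  # 201 Created on success

# Usage (against the sandbox from the post):
#   upload("192.168.1.113", "/tmp/testDIR/test.txt", "test.txt")
```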
01-10-2016
10:38 PM
@Artem Ervits sandbox.hortonworks.com is already in /etc/hosts, and the URL is the same as in the directory creation request. Do you have other suggestions?
01-10-2016
10:10 PM
I'm trying to upload a file to HDFS through WebHDFS. Before the file upload I created a directory "testDIR" with WebHDFS, in /tmp/testDIR, and that works. The first command is:

curl -i -X PUT "http://192.168.1.112:50070/webhdfs/v1/tmp/testDIR/test.txt?op=CREATE"

For the second command I use the URL from "Location", as in the instructions. Second command:

curl -i -X PUT -T test.txt "http://sandbox.hortonworks.com:50075/webhdfs/v1/tmp/testDIR/test.txt?op=CREATE&namenoderpcaddress=sandbox.hortonworks.com:8020&createflag=&createparent=true&overwrite=false"

The response to this command is: "Could not resolve host: sandbox.hortonworks.com". What can I do?
Labels:
- Apache Hadoop
01-10-2016
01:56 PM
I want to build an application that lets users upload a video file, processes it with Spark, and sends the processed file back. So far I can upload files to a Windows web server (Tomcat). After the file is uploaded, I want to send it to HDFS, process it, and after processing send it back to HDFS and the web server. 1) Is it possible to do such a thing? I mean, is there a web service to send files from Windows to HDFS hosted on Linux? Is there a shorter path I can follow? 2) Supposing I can upload files to HDFS, how can I trigger the MapReduce job to run, process the video, and send it back to the web server?
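For question 2, one possible (hypothetical) pattern is a small watcher that polls the HDFS upload directory over WebHDFS (op=LISTSTATUS) and launches a processing job for each new file. The namenode address, inbox path, and the `process_video.py` job below are placeholder assumptions, not from this thread:

```python
# Hypothetical sketch: poll an HDFS "inbox" directory over WebHDFS and
# launch a Spark job for each newly uploaded file. Host, paths and the
# job script are assumptions; adapt them to your environment.
import json
import subprocess
import urllib.request

NAMENODE = "http://192.168.1.113:50070"   # assumed namenode HTTP address
INBOX = "/tmp/videos/inbox"               # assumed upload target directory

def list_new_files(liststatus_json, seen):
    """Return file names from a WebHDFS LISTSTATUS response body that
    are not yet in `seen` (directories are skipped)."""
    statuses = json.loads(liststatus_json)["FileStatuses"]["FileStatus"]
    return [s["pathSuffix"] for s in statuses
            if s["type"] == "FILE" and s["pathSuffix"] not in seen]

def poll_once(seen):
    url = f"{NAMENODE}/webhdfs/v1{INBOX}?op=LISTSTATUS&user.name=hdfs"
    with urllib.request.urlopen(url) as resp:
        for name in list_new_files(resp.read(), seen):
            seen.add(name)
            # Launch the (hypothetical) processing job for the new file.
            subprocess.Popen(["spark-submit", "process_video.py",
                              f"{INBOX}/{name}"])
```

Run `poll_once` in a loop (or from cron); a production setup would more likely use Oozle-style scheduling or a message queue, but the polling sketch shows the moving parts.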
Labels:
- Apache Hadoop