
Cannot connect to webhdfs

New Contributor

I am trying to connect to HDFS through WebHDFS. I've checked that WebHDFS is enabled through the HDFS web UI. However, I get the error "couldn't connect to host" when I run this command:

 

curl -i "http://quickstart.cloudera:14000/webhdfs/v1/user/user.name=cloudera&op=GETFILESTATUS"

 

When I run curl against port 7180 (Cloudera Manager), it works as expected. How can I get WebHDFS running?

10 REPLIES

Mentor
The HttpFS port is 14000 (it speaks the WebHDFS protocol as well), but the regular, non-gateway WebHDFS endpoint is served from the NameNode's web port, 50070 (or 50075 on a DataNode).

Could you try the two variants below? Note the '?' before the first query parameter; without it, user.name=cloudera gets parsed as part of the path:
1. curl -i "http://quickstart.cloudera:50070/webhdfs/v1/user?user.name=cloudera&op=GETFILESTATUS"
2. curl -i "http://localhost:14000/webhdfs/v1/user?user.name=cloudera&op=GETFILESTATUS"

Does either of these work? If not, please re-run with curl -v and post the output here.
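For reference, a successful GETFILESTATUS against either endpoint returns HTTP 200 with a JSON FileStatus object; the shape looks roughly like this (field values here are placeholders and will differ on your cluster):

{"FileStatus":{"pathSuffix":"","type":"DIRECTORY","owner":"hdfs","permission":"755",...}}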

New Contributor

I tried all of them and still cannot connect.

Contributor

Here is the -v output of the curl command you gave:

[root@quickstart cloudera]# curl -i -v "http://quickstart.cloudera:50070/webhdfs/v1/user/user.name=cloudera&op=GETFILESTATUS"
* About to connect() to quickstart.cloudera port 50070 (#0)
*   Trying 10.0.2.15... Connection refused
* couldn't connect to host
* Closing connection #0
curl: (7) couldn't connect to host
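"Connection refused" on 50070 means nothing is listening on that port at all, i.e. the NameNode web UI is down, rather than WebHDFS merely being disabled. A quick way to check from inside the VM (the service name assumes a package-based CDH quickstart install; adjust for your setup):

sudo netstat -tlnp | grep -E ':50070|:14000'   # is anything listening on the HDFS web ports?
sudo service hadoop-hdfs-namenode status       # NameNode daemon state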

Contributor

Here is the response from HttpFS. Please help me!

 

* About to connect() to localhost port 14000 (#0)
*   Trying 127.0.0.1... connected
* Connected to localhost (127.0.0.1) port 14000 (#0)
> GET /webhdfs/v1/user/user.name=cloudera&op=GETFILESTATUS HTTP/1.1
> User-Agent: curl/7.19.7 (x86_64-redhat-linux-gnu) libcurl/7.19.7 NSS/3.13.6.0 zlib/1.2.3 libidn/1.18 libssh2/1.4.2
> Host: localhost:14000
> Accept: */*
> 
< HTTP/1.1 401 Unauthorized
< Server: Apache-Coyote/1.1
< WWW-Authenticate: PseudoAuth
< Set-Cookie: hadoop.auth=; Path=/; Expires=Thu, 01-Jan-1970 00:00:00 GMT; HttpOnly
< Content-Type: text/html;charset=utf-8
< Content-Length: 997
< Date: Mon, 20 Jun 2016 09:53:43 GMT

< 
* Connection #0 to host localhost left intact
* Closing connection #0
<html><head><title>Apache Tomcat/6.0.44 - Error report</title></head><body><h1>HTTP Status 401 - Authentication required</h1><p><b>message</b> <u>Authentication required</u></p><p><b>description</b> <u>This request requires HTTP authentication.</u></p>...
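That 401 with "WWW-Authenticate: PseudoAuth" actually means the request reached HttpFS but carried no authenticated user: since the URL has no '?' before user.name, the whole string is treated as a file path and no user.name parameter is sent at all. The same call with the query separator in place should authenticate (pseudo auth with user cloudera, as in the original command):

curl -i "http://localhost:14000/webhdfs/v1/user?user.name=cloudera&op=GETFILESTATUS"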

Explorer

Hi,

I'm getting the same error as the user above. My service is enabled, so it seems there's something dumb I'm doing.

Any pointers?

Thanks,

Craig

 

 

The requested URL could not be retrieved

While trying to retrieve the URL: ****REMOVED****

The following error was encountered:

We can not connect to the server you have requested.

This means that:

• The server might be busy at this time.
• The server is not reachable.

Please try later to see if you can go through

Contributor

Hi Craig. Apologies for my lateness: I solved the issue by following the breadcrumbs left in the logs. In the end it was the NameNode that couldn't start, because something went wrong during its last partial checkpoint save (I don't remember the exact details, but the files are somewhere in the NameNode data folder). So I deleted the last checkpoint, the NameNode restarted from the second-to-last one, and it came up fine. Ref: http://stackoverflow.com/questions/37962314/hue-cannot-connect-error-on-connection-refused-hbase-can...
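For anyone retracing this: the checkpoint files live under the directory configured as dfs.namenode.name.dir in hdfs-site.xml. The paths below are illustrative defaults from a typical CDH install, not necessarily yours:

ls -lt /dfs/nn/current/ | head                           # fsimage_* and edits_* checkpoint files
sudo tail -n 100 /var/log/hadoop-hdfs/*namenode*.log*    # look for the failed image load at startup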

Explorer

Vale,

no worries, thanks for the response. Everything seems fine on my services page, and the logs don't appear to show anything unusual.

I do get a connection-failure message from my curl request:

curl -i "http://127.0.0.1:1400/webhdfs/v1/user?op=GETCONTENTSUMMARY"

....

Cannot Connect

The proxy could not connect to the destination in time.
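Two things stand out in that attempt, for what it's worth: the URL uses port 1400 rather than 14000, and the error page comes from an HTTP proxy, which suggests curl is honoring an http_proxy environment variable even for 127.0.0.1. A corrected call that bypasses any proxy might look like this (the user.name value is a guess; use a user that exists on your cluster):

curl -i --noproxy '*' "http://127.0.0.1:14000/webhdfs/v1/user?op=GETCONTENTSUMMARY&user.name=hdfs"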

Hi,



I ran the curl commands below. I'm able to connect, but I don't get any results, just a letter "P" along with special characters.

The environment is Cloudera 6.1.2.

I want to use HttpFS to put files into HDFS.

Please suggest.



curl -i -X PUT -T /home/CORP/isuy/Ganesh.txt "http://aalhauap2u01:14000/hdfs:////user/isuy/testfiles?op=CREATE"

curl -i "http://aalhauap2g01:14000/webhdfs/v1/user/user.name=cloudera&op=GETFILESTATUS"



curl -i "http://aalhauap2u02:14000/webhdfs/v1/user/user.name=cloudera&op=GETFILESTATUS" -v
* About to connect() to aalhauap2u02 port 14000 (#0)
* Trying 10.91.23.152...
* Connected to aalhauap2u02 (10.91.23.152) port 14000 (#0)
> GET /webhdfs/v1/user/user.name=cloudera&op=GETFILESTATUS HTTP/1.1
> User-Agent: curl/7.29.0
> Host: aalhauap2u02:14000
> Accept: */*
>
* Connection #0 to host aalhauap2u02 left intact



Thanks,

Explorer

curl -X PUT -L --anyauth -u : -b cookie.jar "http://httpfs_ip:14000/webhdfs/v1/user/file.csv?op=CREATE&data=true&user.name=hdfs" --header "Content-Type:application/octet-stream" --header "Transfer-Encoding:chunked" -T "file.csv"

Just replace httpfs_ip and file.csv with your HttpFS host and the file you want to upload.
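A couple of notes on why the earlier attempts likely failed: the PUT used hdfs://// in place of the /webhdfs/v1/<absolute path> form that HttpFS expects, and the GET calls were missing the '?' before user.name. Once the upload succeeds, it can be verified with a status call like this (same httpfs_ip placeholder; expect HTTP 200 with a JSON FileStatus):

curl -i "http://httpfs_ip:14000/webhdfs/v1/user/file.csv?op=GETFILESTATUS&user.name=hdfs"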