
HttpFS call from browser to Apache Hadoop

I am using Apache Hadoop 2.7.1 and I have a 3-node cluster.

I configured HttpFS and started it on the name node.

We know that we can make WebHDFS requests from the browser.

Example: if we want to open a file through a WebHDFS request,

we call the following URL from the browser

and the response will be the save-or-open-file dialog.
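For illustration, a WebHDFS download URL usually has the shape sketched below; the host name, file path, and user are placeholders, not values taken from this thread, and 50070 is the Hadoop 2.x default NameNode HTTP port:

```python
# Hypothetical sketch of a WebHDFS OPEN URL for Hadoop 2.x.
# "namenode1", the path, and the user are placeholder values.
def webhdfs_open_url(host, path, user, port=50070):
    # 50070 is the default NameNode web UI / WebHDFS port in Hadoop 2.x
    return "http://%s:%d/webhdfs/v1%s?op=OPEN&user.name=%s" % (host, port, path, user)

print(webhdfs_open_url("namenode1", "/user/hadoop/file.txt", "hadoop"))
# → http://namenode1:50070/webhdfs/v1/user/hadoop/file.txt?op=OPEN&user.name=hadoop
```

Pasting a URL of that shape into a browser triggers the save-or-open dialog described above.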

But if we are using HttpFS, can we make an HttpFS request from the browser?

If I call the following request from the browser,

I get the following error:

{"RemoteException":{"message":"org.apache.hadoop.fs.FileSystem: Provider not a subtype","exception":"ServiceConfigurationError","javaClassName":"java.util.ServiceConfigurationError"}}

And if I issue

I get the error:

An error occurred during a connection. SSL received a record that exceeded the maximum permissible length. Error code: SSL_ERROR_RX_RECORD_TOO_LONG

So, can an HttpFS request be made from the browser?

My core-site.xml has these HttpFS properties:
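The HttpFS-related entries in core-site.xml typically look like the following proxyuser properties; the user name `httpfs` is a placeholder for whichever user runs the HttpFS server, and the `*` values are the usual permissive defaults, not values confirmed by this thread:

```xml
<!-- Hypothetical sketch: typical HttpFS proxyuser entries in core-site.xml.
     "httpfs" is a placeholder for the user that runs the HttpFS server. -->
<property>
  <name>hadoop.proxyuser.httpfs.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.httpfs.groups</name>
  <value>*</value>
</property>
```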



Super Guru

You are using HDInsight: not a subtype

Read this for Azure WebHDFS REST API Requirements:

Any other messages in the log? Do you have SSL set up on your cluster?

Are you logging into your cluster?

I am using Hadoop 2.7.1, not Azure. This is the output of hadoop version:

Hadoop 2.7.1 Subversion -r 15ecc87ccf4a0228f35af08fc56de536e6ce657a Compiled by jenkins on 2015-06-29T06:04Z Compiled with protoc 2.5.0 From source with checksum fc0a1a23fc1868e4d5ee7fa2b28a58a

I have no SSL configured, and I am able to log in to my cluster.

But is this call


the right call to send an HttpFS request from a Windows browser to the Hadoop cluster?

Or can I not make an HttpFS request from a Windows browser, because it depends on the Tomcat server?
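For reference, HttpFS serves the same webhdfs/v1 REST API as the NameNode, just from the HttpFS (Tomcat) server, by default on port 14000, so a browser can call it directly over plain HTTP. A sketch of the usual URL shape, where the host, path, and user are placeholders:

```python
# Hypothetical sketch of an HttpFS OPEN URL. HttpFS exposes the same
# webhdfs/v1 REST API, but the request goes to the HttpFS server,
# by default on port 14000. Host, path, and user are placeholders.
def httpfs_open_url(httpfs_host, path, user, port=14000):
    return "http://%s:%d/webhdfs/v1%s?op=OPEN&user.name=%s" % (httpfs_host, port, path, user)

print(httpfs_open_url("namenode1", "/user/hadoop/file.txt", "hadoop"))
# → http://namenode1:14000/webhdfs/v1/user/hadoop/file.txt?op=OPEN&user.name=hadoop
```

Note the scheme must be http://, not https://, unless SSL is configured for HttpFS; requesting https:// against a plain-HTTP port is a typical cause of SSL_ERROR_RX_RECORD_TOO_LONG in the browser.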

I solved this problem by issuing a stop and then a start. Even though I had applied this solution before in the same way and don't see a difference, it worked for me this time, so restarting all services solved my problem.