Created 04-10-2017 02:12 PM
I am using Apache Hadoop 2.7.1 and I have a 3-node cluster.
I configured HttpFS and started it on the NameNode (192.168.4.128).
We know that we can make a WebHDFS request from the browser.
Example: if we want to open a file through a WebHDFS request,
we call the following URL from the browser:
http://192.168.4.128:50070/webhdfs/v1/hadoophome/myfile.txt/?user.name=root&op=OPEN
and the response will be a save-or-open-file dialog.
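The same OPEN request can be issued programmatically instead of from a browser. A minimal sketch, assuming the host, ports, path, and user from this thread (`fetch` needs network access to the cluster, so it is only a helper here):

```python
# Sketch: building and issuing a WebHDFS/HttpFS OPEN request in Python.
# Host 192.168.4.128, ports 50070/14000, path /hadoophome/myfile.txt and
# user root are taken from this thread; adjust for your cluster.
from urllib.parse import urlencode
from urllib.request import urlopen

def open_url(host, port, path, user):
    """Build the WebHDFS v1 OPEN URL for an HDFS file path."""
    query = urlencode({"op": "OPEN", "user.name": user})
    return f"http://{host}:{port}/webhdfs/v1{path}?{query}"

# Port 50070 is the NameNode's WebHDFS endpoint; port 14000 is HttpFS.
# Both speak the same REST API, so only the port differs.
namenode_url = open_url("192.168.4.128", 50070, "/hadoophome/myfile.txt", "root")
httpfs_url = open_url("192.168.4.128", 14000, "/hadoophome/myfile.txt", "root")

def fetch(url):
    # Note: for OPEN, the NameNode replies with a redirect to a DataNode,
    # while HttpFS streams the file content itself; urlopen follows
    # redirects either way.
    with urlopen(url) as resp:
        return resp.read()
```

Note the URLs use plain `http://`, matching the SSL observation later in this thread: the HttpFS port is not serving HTTPS unless SSL has been configured.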
But if we are using HttpFS, can we make an HttpFS request from the browser?
If I call the following request from the browser:
http://192.168.4.128:14000/webhdfs/v1/hadoophome/myfile.txt/?op=open&user.name=root
I get the following error:
{"RemoteException":{"message":"org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.azure.NativeAzureFileSystem not a subtype","exception":"ServiceConfigurationError","javaClassName":"java.util.ServiceConfigurationError"}}
And if I issue:
https://192.168.4.128:14000/webhdfs/v1/aloosh/oula.txt/?op=open&user.name=root
I get the error:
An error occurred during a connection to 192.168.4.128:14000. SSL received a record that exceeded the maximum permissible length. Error code: SSL_ERROR_RX_RECORD_TOO_LONG
So, can an HttpFS request be made from the browser?
My core-site.xml has these HttpFS-related properties:
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
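For reference, the Hadoop HttpFS documentation names these proxyuser properties after the Unix user that runs the HttpFS server; they are `hadoop.proxyuser.root.*` here because HttpFS is being run as root. A sketch of the same settings for a hypothetical dedicated `httpfs` service user (an assumption, not part of this thread's setup):

```xml
<!-- core-site.xml: allow a (hypothetical) httpfs service user to
     impersonate end users from any host and any group -->
<property>
  <name>hadoop.proxyuser.httpfs.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.httpfs.groups</name>
  <value>*</value>
</property>
```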
Created 04-10-2017 07:56 PM
It looks like you are using HDInsight: "org.apache.hadoop.fs.azure.NativeAzureFileSystem not a subtype"
Read this for the Azure WebHDFS REST API requirements:
https://docs.microsoft.com/en-us/azure/data-lake-store/data-lake-store-overview
Any other messages in the log? Do you have SSL set up on your cluster? The SSL_ERROR_RX_RECORD_TOO_LONG error usually means an https:// URL was sent to a port that is only serving plain HTTP.
Are you able to log into your cluster?
Created 04-11-2017 06:14 AM
I am using Apache Hadoop 2.7.1, not Azure. This is the output of hadoop version:
Hadoop 2.7.1
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r 15ecc87ccf4a0228f35af08fc56de536e6ce657a
Compiled by jenkins on 2015-06-29T06:04Z
Compiled with protoc 2.5.0
From source with checksum fc0a1a23fc1868e4d5ee7fa2b28a58a
I have no SSL configured, and I am able to log into my cluster.
But is this call the right way to send an HttpFS request from a Windows browser to the Hadoop cluster,
or can I not make an HttpFS request from a Windows browser because it depends on the Tomcat server?
Created 04-11-2017 08:20 AM
I solved this problem by issuing stop-all.sh, then start-all.sh, then httpfs.sh start. I had applied this solution before, but with stop-dfs.sh, then start-dfs.sh, then httpfs.sh start, and I don't see the difference. But that worked for me, so restarting all services solved my problem.