Member since: 07-28-2015
Posts: 5
Kudos Received: 3
Solutions: 2

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 24585 | 08-14-2015 07:07 AM
 | 7737 | 07-28-2015 05:54 AM
08-14-2015 07:07 AM
1 Kudo
Ok, I figured out the issue: the entire URL is needed for the full path of the files in HDFS. Port 8020 is for back-end HDFS communication; you cannot connect to that port directly. The URLs to connect to HDFS over WebHDFS are listed below. Please keep in mind these are the default ports and can be changed.

Non-secure: http://<namenode>:50070/webhdfs/v1/<directory>
Secure: http://<namenode>:50470/webhdfs/v1/<directory>

However, my issue is that when I go to create the connection from PowerBI I only have one option, which is to input the server name (there is no description or help for what goes in this field) or the namenode where the HDFS web interface is running. After inputting this info I get the error below.

DataSource.Error: HDFS cannot connect to server 'namenode01.test.com'. Unable to connect to the remote server.
Details:
DataSourceKind=Hdfs
DataSourcePath=http://namenode01.test.com:50070/webhdfs/v1
Url=http://namenode01.test.com:50070/webhdfs/v1/

Instead of putting only the server name, I placed "http://<namenode>:50470/webhdfs/v1/<directory>" in the server field, replacing <namenode> with the name of the server and <directory> with the HDFS path of the data I want. The issue now is that PowerBI does not support Parquet or SequenceFile formats, /cry; only text and open formats currently seem to work, which is not unexpected.

Thanks,
Rusty
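For anyone hitting the same wall: before pointing PowerBI at the cluster, it helps to confirm the WebHDFS endpoint itself is reachable. A minimal sketch from the command line, assuming the default non-secure port and a hypothetical /user/rusty directory:

# list a directory over WebHDFS; a JSON FileStatuses response means the endpoint works
curl -i "http://<namenode>:50070/webhdfs/v1/user/rusty?op=LISTSTATUS"

If this returns JSON but PowerBI still fails, the problem is on the PowerBI side rather than in HDFS.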
08-05-2015 10:26 AM
I am trying to connect Microsoft's PowerBI Desktop to HDFS, but I am unable to do so. I keep getting error 400; I also tried connecting using the Web source instead of HDFS but get the same error. By default the HDFS data source sets the port to 50070; the secure port is 50470, and I am currently looking for a way to change that. I have also updated my JAAS settings to forward a Kerberos ticket, but it didn't have an effect. Has anyone tried this or made it work on a secure cluster? Any ideas greatly appreciated. Thanks,
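One way to take PowerBI out of the equation on a Kerberized cluster is to test WebHDFS directly with curl's SPNEGO support. A sketch, assuming curl was built with GSS-API/Kerberos support, a valid ticket already obtained via kinit, and the default port from above:

# authenticate with the Kerberos ticket; "-u :" leaves the user to GSS negotiation
curl -i --negotiate -u : "http://<namenode>:50070/webhdfs/v1/<directory>?op=GETFILESTATUS"

A 401 here points at the Kerberos/JAAS side, while a 400 suggests the request URL itself is malformed.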
07-28-2015 05:54 AM
Hey venu123, HDFS does not pass through Sentry, so it will not adhere to any rules you set directly in Sentry; it only looks at file ACLs. To manage HDFS permissions with Sentry you have to enable the plugin for HDFS/Sentry sync and configure it appropriately. With the sync enabled, Hive checks the configuration and then references the group in Sentry, but the group will be applied automatically as an ACL by Sentry. To get things working, use the "hadoop fs -setfacl" command to add the user as an ACL entry. To have the user added automatically as files are created and deleted, add them to the default ACL on the root folder. (Please note this was hit and miss for me; sometimes it worked, other times it did not.) Example adding to the default ACL:

hadoop fs -setfacl -R -m default:user:username:r-x /<path>
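It also helps to read the ACLs back after setting them, so you can confirm the default entry actually landed. A small sketch, using a hypothetical /data/shared path and user venu:

# grant read/execute through a default ACL so newly created children inherit it
hadoop fs -setfacl -R -m default:user:venu:r-x /data/shared
# read the ACL entries back; the output lists both access and default entries
hadoop fs -getfacl /data/shared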