We have a cluster on Azure with ADLS as the storage layer. I have updated core-site.xml with the credential keys so that I can access ADLS from the HDFS CLI. However, when I try to create an external table in Hive using Beeline, pointing at a CSV file that sits in an ADLS folder, I get a permission error indicating that the end user doesn't have permission to read that file.
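For reference, the core-site.xml properties I added are the standard hadoop-azure-datalake OAuth2 keys (values below are placeholders, not my real credentials):

```xml
<property>
  <name>fs.adl.oauth2.access.token.provider.type</name>
  <value>ClientCredential</value>
</property>
<property>
  <name>fs.adl.oauth2.client.id</name>
  <value>APPLICATION_ID_PLACEHOLDER</value>
</property>
<property>
  <name>fs.adl.oauth2.credential</name>
  <value>CLIENT_SECRET_PLACEHOLDER</value>
</property>
<property>
  <name>fs.adl.oauth2.refresh.url</name>
  <value>https://login.microsoftonline.com/TENANT_ID_PLACEHOLDER/oauth2/token</value>
</property>
```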
I also checked that Hive impersonation is set to true, which I understood to mean it would be the hive service user, not the end user, that needs access to the file. Given that, I am not sure why I see an error about the end user lacking permission to read the file on ADLS. Can someone shed some light on this and point me in the right direction?
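For completeness, the impersonation setting I verified is the usual one in hive-site.xml (assuming the standard property name):

```xml
<property>
  <name>hive.server2.enable.doAs</name>
  <value>true</value>
</property>
```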
We have the cluster running with HDP 2.6.4 on Azure.
Here is the SQL I am using to create the external table (I connected to Hive via Beeline as user 'ravi'):
CREATE EXTERNAL TABLE IF NOT EXISTS Cars(
Name STRING,
Miles_per_Gallon INT,
Cylinders INT,
Displacement INT,
Horsepower INT,
Weight_in_lbs INT,
Acceleration DECIMAL,
Year DATE,
Origin CHAR(1))
COMMENT 'Data about cars from a public database'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION 'adl://hdpadls.azuredatalakestore.net/folder1';
Error Message:
Error: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [ravi] does not have [ALL] privilege on [adl://hdpadls.azuredatalakestore.net/folder1] (state=42000,code=40000)
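A sanity check I can run from the HDFS CLI (using the same path as in the DDL above) to see the permissions and ACLs Hadoop reports for the folder:

```
# List the folder and its POSIX-style permissions as seen by Hadoop
hdfs dfs -ls adl://hdpadls.azuredatalakestore.net/folder1

# Show the full ACL on the folder (ADLS Gen1 enforces POSIX-style ACLs)
hdfs dfs -getfacl adl://hdpadls.azuredatalakestore.net/folder1
```

Both commands succeed for me as 'ravi' on the CLI side, which is why the Beeline error is confusing.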