Created on 02-04-2022 03:51 AM - last edited on 02-10-2022 12:07 PM by DianaTorres
Hi,
I am getting the error message in the title, but I am stuck, as I have already checked the following:
* Ranger permissions: cm_s3 / all - bucket, path -- the user is in the list for read / write / all permissions
* IDBroker role: the user has a cdp-datalake-admin-role, which has a cdp-datalake-admin-policy-s3access
Any other ideas on what to check?
thanks
Created 02-10-2022 04:42 PM
HiveAccessControlException suggests you are accessing this s3 location through a SQL engine (Hive or Impala perhaps). Check in Ranger, under Hadoop SQL, if the policies are set properly there to access the table you are looking at.
Also, is this a RAZ-enabled environment, by any chance? If it is, please see here for RAZ setup specific to Hive table access: https://docs.cloudera.com/management-console/cloud/fine-grained-access-control-aws/topics/raz-aws-cr...
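As a quick way to check whether a Ranger Hadoop SQL policy is what's blocking you, you can query the table directly through HiveServer2 with beeline (the hostname, Kerberos realm, and table name below are placeholders — substitute your own):

```shell
# Connect to HiveServer2 with beeline (kerberized cluster; replace host/realm with your values)
beeline -u "jdbc:hive2://hs2-host.example.com:10000/default;principal=hive/_HOST@EXAMPLE.COM"

# Then, inside beeline, try reading the table. If this SELECT fails with
# HiveAccessControlException, the blocking policy is under Ranger > Hadoop SQL,
# not the cm_s3 policy:
#   SELECT * FROM your_table LIMIT 1;
```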
Hope this helps,
Alex
Created 02-11-2022 01:11 AM
Hi @aakulov ,
thanks for your reply.
In the last few days we got it fixed, thanks to help from CE support.
Not sure why, but he decided to reinstall Hive from scratch and replace it with **Hive on Tez**.
The sqoop commands now seem to run fine, after updating the --hs2-url parameter accordingly (and after regenerating the Kerberos tickets for Hive).
Thanks anyway for your suggestions -- I hope my answer will be useful to someone.
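For anyone hitting the same problem, the fix described above amounts to renewing the Kerberos ticket and pointing sqoop's --hs2-url at the new Hive on Tez HiveServer2 endpoint. A rough sketch (every hostname, realm, keytab path, database, and table name here is a placeholder, not our actual configuration):

```shell
# Renew the Kerberos ticket for the user running sqoop (example keytab path)
kinit -kt /path/to/user.keytab user@EXAMPLE.COM

# Import, with --hs2-url pointing at the Hive on Tez HiveServer2 instance
sqoop import \
  --connect "jdbc:mysql://db-host.example.com:3306/sourcedb" \
  --table source_table \
  --hive-import \
  --hive-table target_table \
  --hs2-url "jdbc:hive2://hive-on-tez-host.example.com:10000/default;principal=hive/_HOST@EXAMPLE.COM"
```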
kind regards,
gr