After upgrading HDP to 2.6.1, insert command does not work if the data is located in S3

Contributor

Hello,

When I try running the following command:

ALTER TABLE btest.testtable ADD IF NOT EXISTS PARTITION (load_date='2017-06-19') LOCATION 's3a://testbucket/data/xxx/load_date=2017-06-19';

I get the following error:

Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [hive] does not have [READ] privilege on [s3a://testbucket/data/xxx/load_date=2017-06-19]

FYI:

SELECT statements work fine; I can query data located in S3. It is just the INSERT statement that is failing. We are using Ranger for authorization, but the hive user has full permissions on all databases and tables.

1 ACCEPTED SOLUTION


@bhavik shah

1. Before you upgraded, were you able to read and write from that S3 location?

2. In HDP 2.6.1, if you want to use Ranger to control access to a specific S3 location, you can use the URL parameter when defining a Hive policy:

https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.1/bk_security/content/hive_policy.html

Are you using that? If not, you could try creating a new Hive policy that grants the hive user access to that S3 location. Note, however, that this feature is a technical preview.
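
For reference, here is a minimal sketch of what creating such a URL-based policy could look like through the Ranger Admin public REST API (the same policy can be created in the Ranger Admin UI instead). The host, credentials, service name, and policy name below are hypothetical placeholders, and the exact JSON fields may vary between Ranger versions:

# Hedged sketch: create a Ranger Hive policy whose "url" resource covers the
# S3 table location, granting the hive user read/write access. The host,
# credentials, and service name are assumptions -- adjust for your cluster.
import requests

RANGER_ADMIN = "http://ranger-admin.example.com:6080"  # assumed Ranger Admin host/port
AUTH = ("admin", "admin")                              # assumed admin credentials

policy = {
    "service": "cluster_hive",  # assumed name of the Hive repository in Ranger
    "name": "hive-s3-testbucket",
    "description": "Allow the hive user to read/write the S3 table location",
    "isEnabled": True,
    "resources": {
        # "url" is the URL parameter described in the HDP doc linked above
        "url": {"values": ["s3a://testbucket/data/xxx"], "isRecursive": True}
    },
    "policyItems": [
        {
            "users": ["hive"],
            "accesses": [
                {"type": "read", "isAllowed": True},
                {"type": "write", "isAllowed": True},
            ],
            "delegateAdmin": False,
        }
    ],
}

resp = requests.post(RANGER_ADMIN + "/service/public/v2/api/policy",
                     json=policy, auth=AUTH)
resp.raise_for_status()
print("Created policy id:", resp.json().get("id"))

Once a policy like this is in place (and the Ranger Hive plugin has synced it), the ALTER TABLE ... ADD PARTITION statement above should no longer fail with the [READ] permission error.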

Contributor

Hello @Dominika Bialek, thanks for the response. That was the issue. After adding the S3 location to the Hive policy, the issue was resolved.