How to configure multiple S3 end-points in a multi-tenanted cluster?
Labels: Hortonworks Data Platform (HDP)
Created on 06-16-2017 10:58 AM; last edited on 08-28-2019 07:21 AM by cjervis
I have a multi-tenanted HDP 2.3 cluster. It has been configured with an S3 end-point in the custom hdfs-site.xml. Is it possible to add another S3 end-point for another tenant? If so, what should the property name be?
Thanks in advance.
Created 06-16-2017 06:09 PM
Hi @Phoncy Joseph. In HDP 2.6.1 you can set per-bucket properties to authenticate with multiple buckets. See https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.1/bk_cloud-data-access/content/s3-auth-per-bu....
However, I believe this was introduced in HDP 2.5 or 2.6, so it is most likely not available in HDP 2.3.
CC @stevel
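For reference, per-bucket S3A settings follow the pattern fs.s3a.bucket.<bucket-name>.<property>, which overrides the base fs.s3a.* value for that one bucket. A minimal sketch of what this could look like in core-site.xml (or the custom hdfs-site.xml mentioned above), assuming two hypothetical buckets tenant-a-data and tenant-b-data with their own credentials and end-points:

```xml
<!-- Sketch only: bucket names, keys, and end-points are placeholders. -->
<!-- Per-bucket options override the cluster-wide fs.s3a.* defaults. -->

<!-- Tenant A -->
<property>
  <name>fs.s3a.bucket.tenant-a-data.access.key</name>
  <value>TENANT_A_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.bucket.tenant-a-data.secret.key</name>
  <value>TENANT_A_SECRET_KEY</value>
</property>
<property>
  <name>fs.s3a.bucket.tenant-a-data.endpoint</name>
  <value>s3.eu-west-1.amazonaws.com</value>
</property>

<!-- Tenant B -->
<property>
  <name>fs.s3a.bucket.tenant-b-data.access.key</name>
  <value>TENANT_B_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.bucket.tenant-b-data.secret.key</name>
  <value>TENANT_B_SECRET_KEY</value>
</property>
<property>
  <name>fs.s3a.bucket.tenant-b-data.endpoint</name>
  <value>s3.us-east-1.amazonaws.com</value>
</property>
```

With something like this in place, `hadoop fs -ls s3a://tenant-a-data/` and `hadoop fs -ls s3a://tenant-b-data/` would each pick up their own credentials and end-point.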
Created 08-28-2019 04:17 AM
I am using HDP 2.5, but after I configured multiple buckets, running an ls threw a 403 access denied error.
Created 06-20-2017 05:13 PM
@Phoncy Joseph To add to the previous answer:
If you upgrade to HDP 2.6.1, there is a new Ranger feature that allows you to control access to S3 buckets via Hive on a per-user basis. It is a new "URL" parameter available when creating a Hive policy: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.1/bk_security/content/hive_policy.html.
So once you add multiple S3 buckets, you can control which users can read from or write to which buckets.
