Member since: 09-29-2015
Posts: 36
Kudos Received: 26
Solutions: 10
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2317 | 11-07-2018 12:45 AM
 | 1103 | 11-07-2018 12:22 AM
 | 3236 | 03-15-2018 03:55 PM
 | 4429 | 02-27-2018 09:51 PM
 | 3915 | 12-29-2016 06:50 PM
11-07-2018
12:48 AM
For HDP-3.x, the HBase table name is 'atlas_janus'.
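A quick way to confirm this on the cluster (a sketch, assuming shell access to an HBase client node):

hbase shell
list 'atlas_janus'
describe 'atlas_janus'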
11-07-2018
12:45 AM
1 Kudo
ATLAS-2162 addressed this requirement - adding a hyperlink based on an attribute value. This should be available since HDP-2.6.3.
11-07-2018
12:26 AM
@meirs84 - can you please add details of the API call and the error received?
11-07-2018
12:22 AM
Atlas search APIs don't support multiple typeNames in a single call. There have been a few requests to support this - both for entity types and classification types. The Atlas team will look into this, but there is no concrete date/release for this enhancement. Until then, you will have to make 2 basic-search API calls - one for hive_table and another for hive_column, as sketched below. As you said, full-text may not be a good choice.
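A minimal sketch of the two basic-search calls (host, port and credentials are placeholders; the filters shown are only illustrative):

curl -k -u admin:admin -H "Content-Type: application/json" -X POST "https://<atlas-host>:<port>/api/atlas/v2/search/basic" -d '{ "typeName": "hive_table", "excludeDeletedEntities": true, "limit": 100 }'

curl -k -u admin:admin -H "Content-Type: application/json" -X POST "https://<atlas-host>:<port>/api/atlas/v2/search/basic" -d '{ "typeName": "hive_column", "excludeDeletedEntities": true, "limit": 100 }'

The results of the two calls can then be merged on the client side.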
03-15-2018
03:55 PM
2 Kudos
@Laura Ngo - by default the query returns a maximum of 100 results. To retrieve more results, please add the query parameter "limit" to the REST call, as shown below (addition of query parameter "limit=1234"):

curl -k -u admin:admin -H "Content-type:application/json" -X GET "https://url:port/api/atlas/v2/search/dsl?limit=1234&query=hive_column%20where%20__state%3D%27ACTIVE%27%20and%20qualifiedName%20like%20%27prod_%2A_data_lake%2A%27%20select%20qualifiedName%2Cname%2C__guid" | python -m json.tool > hive_column_prod_data_lake.json

Please note that a maximum of 10,000 results will be returned even if the specified limit is higher. Both the default and the maximum limit can be configured with the following properties:

atlas.search.defaultlimit=100
atlas.search.maxlimit=10000
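If you need to page through results across several calls, a minimal sketch using the "offset" query parameter alongside "limit" (same placeholder host/credentials as above; the values are only illustrative):

# first page: results 0-999
curl -k -u admin:admin -X GET "https://url:port/api/atlas/v2/search/dsl?limit=1000&offset=0&query=hive_column" | python -m json.tool
# second page: results 1000-1999
curl -k -u admin:admin -X GET "https://url:port/api/atlas/v2/search/dsl?limit=1000&offset=1000&query=hive_column" | python -m json.tool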
03-05-2018
05:55 PM
@Laura Ngo - the policy contents look right. The audit log shows the tag name in lower case - "data_zone". Please ensure that the tag name used in the condition is the same as the one in Atlas.
03-05-2018
01:32 AM
@Laura Ngo - I was able to reproduce the issue. The expression you entered is indeed correct:

ctx.getAttributeValue("DATA_ZONE","name").equals("data_lake")

However, when the policy is saved from the UI, the entered expression is broken into multiple strings, causing the evaluation to fail at runtime. This is likely an issue with the Ranger UI. I was able to work around the issue by updating the policies via the REST API; if possible, please update the policies via the REST API, as sketched below. I will update on the UI issue shortly.
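A minimal sketch of the REST-based workaround using Ranger's public API (policy id, host and credentials are placeholders; the policy JSON comes from the first call, with the condition carried as a single string):

# fetch the current policy definition
curl -u admin:admin -X GET "http://<ranger-host>:6080/service/public/v2/api/policy/<policy-id>" > policy.json

# edit policy.json so the condition is one unbroken string, then push it back
curl -u admin:admin -H "Content-Type: application/json" -X PUT "http://<ranger-host>:6080/service/public/v2/api/policy/<policy-id>" -d @policy.json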
03-04-2018
11:11 PM
@Laura Ngo - can you verify that no other policy allowed select access on the footmart database for holger_gov (please check the Ranger audit log)?
02-27-2018
09:51 PM
1 Kudo
Is it possible to reference more than one Atlas tag in one Ranger policy via the Policy Conditions? Yes. The following can be used to access details of all tags associated with the resource being accessed:

ctx.getAllTagTypes()                      <== returns names of all tags associated with the resource (Set<String>)
ctx.getTagAttributes(tagType)             <== returns attributes of the given tag (Map<String, String>)
ctx.getAttributeValue(tagType, attrName)  <== returns value of attribute 'attrName' in tag 'tagType'

The use case you describe seems to require access control based on tenancy and the zone in which the data resides. Please consider the following approach:

1. Define a classification named 'DATA_ZONE', with one attribute named "name" - as shown below (a sketch of registering it via the REST API follows this list):

"classificationDefs": [
  {
    "name": "DATA_ZONE",
    "attributeDefs": [
      {
        "name": "name",
        "typeName": "string"
      }
    ]
  }
]

2. Define one classification for each tenant. In your example, you already have 2 classifications: "tenancy_xxx" and "tenancy_yyy".

3. Create one tag-based policy for each tenant. Per your example, you would create 2 policies - one for the "tenancy_xxx" tag and another for the "tenancy_yyy" tag.

4. In the policy for each tenant, you can use conditions as shown below to allow/deny access to users/groups:

ctx.getAttributeValue("DATA_ZONE", "name").equals("landing")
ctx.getAttributeValue("DATA_ZONE", "name").equals("staging")
ctx.getAttributeValue("DATA_ZONE", "name").equals("data_lake")
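As mentioned in step 1, a minimal sketch of registering the DATA_ZONE classification through the Atlas type-definition REST API (host, port and credentials are placeholders):

curl -k -u admin:admin -H "Content-Type: application/json" -X POST "https://<atlas-host>:<port>/api/atlas/v2/types/typedefs" -d '{
  "classificationDefs": [
    {
      "name": "DATA_ZONE",
      "attributeDefs": [
        { "name": "name", "typeName": "string" }
      ]
    }
  ]
}'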
03-31-2017
07:31 AM
The export/import feature helps copy Atlas data from one instance to another. However, it does not replace the need for a backup.
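For reference, a minimal sketch of the export and import REST calls (hosts, credentials and the hive_db qualifiedName are placeholders):

# export a database and everything reachable from it, from the source instance
curl -k -u admin:admin -H "Content-Type: application/json" -X POST "http://<source-atlas>:21000/api/atlas/admin/export" -d '{
  "itemsToExport": [
    { "typeName": "hive_db", "uniqueAttributes": { "qualifiedName": "mydb@cluster1" } }
  ],
  "options": { "fetchType": "full" }
}' -o export.zip

# import the archive into the target instance
curl -k -u admin:admin -H "Content-Type: multipart/form-data" -X POST -F data=@export.zip "http://<target-atlas>:21000/api/atlas/admin/import"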