Member since 03-29-2016

36 Posts
12 Kudos Received
5 Solutions

        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1968 | 04-08-2018 09:30 PM |
| | 1229 | 03-15-2018 08:13 PM |
| | 1801 | 03-14-2018 04:10 PM |
| | 3130 | 03-14-2018 03:48 PM |
| | 2114 | 02-19-2018 01:03 AM |

04-08-2018 09:30 PM
1 Kudo

It's not elegant, but this custom masking seems to work:

    cast(mask(cast(to_date({col}) as date), 'x', 'x', 'x', -1, '1', 1, 0, -1) as timestamp)

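A minimal local illustration, in plain shell rather than Hive, of what the expression above is intended to produce, assuming the mask() call keeps the year while resetting month and day to 01-01 and zeroing the time-of-day:

```shell
# Local sketch only: mimics the intended result of the Hive mask() call above
# (keep year, force month/day to 01-01, zero the time) on a sample timestamp.
ts='1983-07-24 10:15:32'
masked="${ts%%-*}-01-01 00:00:00"   # everything before the first '-' is the year
echo "$masked"                      # prints: 1983-01-01 00:00:00
```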
04-08-2018 12:54 AM

Is there a way of masking timestamps using the masking policies in Ranger? Perhaps through a custom UDF referenced in the policy? The masking type of "Date: show only year" only works for dates, not timestamps. Looking at GitHub (as the Ranger documentation isn't complete), timestamps cannot be masked, only nullified - https://github.com/myui/hive-udf-backports/tree/master/src/main/java/org/apache/hadoop/hive/ql/udf/generic

    mask(value, upperChar, lowerChar, digitChar, otherChar, numberChar, dayValue, monthValue, yearValue)
    Supported types: TINYINT, SMALLINT, INT, BIGINT, STRING, VARCHAR, CHAR, DATE

Reason: some systems store dates as timestamps, so a date of birth (PII) could be stored as a timestamp, and I need to mask it.

Labels:
- Apache Atlas
- Apache Ranger

03-15-2018 08:16 PM

If you want to change something you need to use PUT. Use POST to add new tags.

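A hedged sketch of that distinction against the Atlas v2 entity-classification endpoint - the host, credentials and GUID below are placeholders, and the payload is illustrative only: POST attaches new classifications to an entity, PUT updates ones already attached.

```shell
# Illustrative payload; the typeName and attributes are made up for the example.
payload='[{"typeName": "PII", "attributes": {"level": "high"}}]'
echo "$payload" | python3 -m json.tool   # sanity-check the JSON locally

# Attach new tags to an entity (POST) - placeholders for host/credentials/GUID:
#   curl -u admin:admin -H 'Content-Type: application/json' -X POST \
#     -d "$payload" 'http://atlas-host:21000/api/atlas/v2/entity/guid/<guid>/classifications'
# Change a tag that is already attached (PUT):
#   curl -u admin:admin -H 'Content-Type: application/json' -X PUT \
#     -d "$payload" 'http://atlas-host:21000/api/atlas/v2/entity/guid/<guid>/classifications'
```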
03-15-2018 08:13 PM
2 Kudos

The taxonomy is still in tech preview - have you switched it on using the documentation linked below? Do you see a 'Taxonomy' tab in the UI?

https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.1/bk_data-governance/content/atlas_enabling_taxonomy_technical_preview.html

Just a word of caution: in a large production environment, switching the taxonomy on can cause Atlas performance issues. We switched it off until it comes out of tech preview.

Here's some info on how to query terms via curl: https://atlas.apache.org/0.7.1-incubating/AtlasTechnicalUserGuide.pdf - I can't find any info about terms in the latest REST API documentation.

    POST http://<atlasserverhost:port>/api/atlas/v1/taxonomies/Catalog/terms/{term_name}

I'm sure someone who knows more than I do will come along soon!

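Pulling the endpoint quoted above into a concrete URL - note that 'atlas-host:21000' and the term name 'pii' are placeholders for illustration:

```shell
# Assemble the v1 taxonomy-terms URL from the answer above; the host and the
# term name are stand-ins, not real values.
BASE='http://atlas-host:21000/api/atlas/v1/taxonomies'
TERM='pii'
URL="$BASE/Catalog/terms/$TERM"
echo "$URL"
# Create the term with:  curl -u admin:admin -X POST "$URL"
```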
03-14-2018 04:10 PM
1 Kudo

I think you are using the wrong API. I believe you need to use PUT /v2/types/typedefs - see https://atlas.apache.org/api/v2/ui/index.html#!/TypesREST/resource_TypesREST_updateAtlasTypeDefs_PUT

To see the current definition of hive_table you can do:

    GET http://localhost:21000/api/atlas/v2/types/typedef/name/hive_table

03-14-2018 03:48 PM

Are you trying to add a description to a tag? If so, you are using the wrong API. The one you are using is for adding tags to entities, like Hive tables.

To add a description to a tag you need to use /v2/types/typedefs (POST to add a new tag and PUT to edit an existing one) - see https://atlas.apache.org/api/v2/ui/index.html#!/TypesREST/resource_TypesREST_createAtlasTypeDefs_POST

    {
      "classificationDefs": [
        {
          "createdBy": "admin",
          "name": "test_tag_name",
          "description": "Description of your tag",
          "attributeDefs": [
            {
              "name": "attribute_name_1",
              "typeName": "string",
              "isOptional": "true",
              "isUnique": "false",
              "isIndexable": "true",
              "cardinality": "SINGLE"
            },
            {
              "name": "attribute_name_2",
              "typeName": "string",
              "isOptional": "true",
              "isUnique": "false",
              "isIndexable": "true",
              "cardinality": "SINGLE"
            },
            {
              "name": "update_date",
              "typeName": "date",
              "isOptional": "true",
              "isUnique": "false",
              "isIndexable": "true",
              "cardinality": "SINGLE"
            }
          ],
          "superTypes": []
        }
      ]
    }

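Before POSTing a typedef payload like the one above, it can be worth validating the JSON locally. A pared-down sketch, with the host and credentials in the curl comment as placeholders:

```shell
# Minimal classificationDefs payload modelled on the fuller example above.
cat > /tmp/tag_typedef.json <<'EOF'
{
  "classificationDefs": [
    {
      "name": "test_tag_name",
      "description": "Description of your tag",
      "attributeDefs": [],
      "superTypes": []
    }
  ]
}
EOF
python3 -m json.tool < /tmp/tag_typedef.json > /dev/null && echo "payload is valid JSON"
# POST creates the tag, PUT edits it later (placeholder host/credentials):
#   curl -u admin:admin -H 'Content-Type: application/json' -X POST \
#     -d @/tmp/tag_typedef.json 'http://atlas-host:21000/api/atlas/v2/types/typedefs'
```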
03-10-2018 10:15 PM

I can only return a maximum of 100 results when doing a DSL API search in Atlas. Is this by design or a bug? Even with a limit of 1000, only 100 items are returned:

    curl -k -u admin:admin -H "Content-type:application/json" -X GET https://url:port/api/atlas/v2/search/dsl?query=hive_column%20where%20__state%3D%27ACTIVE%27%20and%20qualifiedName%20like%20%27prod_%2A_data_lake%2A%27%20select%20qualifiedName%2Cname%2C__guid%20limit%201000 | python -m json.tool > hive_column_prod_data_lake_limit.json

With no limit at all, 100 items are still returned. There are a lot more than 100 items - and when I do the same for hive_tables, only 100 items are returned too.

    curl -k -u admin:admin -H "Content-type:application/json" -X GET https://url:port/api/atlas/v2/search/dsl?query=hive_column%20where%20__state%3D%27ACTIVE%27%20and%20qualifiedName%20like%20%27prod_%2A_data_lake%2A%27%20select%20qualifiedName%2Cname%2C__guid | python -m json.tool > hive_column_prod_data_lake.json

This is on an HDP 2.6.3 install.

Labels:
- Apache Atlas

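If the server caps a single response at 100 rows, one workaround - assuming Atlas DSL honours 'limit ... offset ...' pagination - is to fetch the results page by page. A sketch that builds one page's encoded URL (the host is a placeholder):

```shell
# Build a URL-encoded DSL query that fetches one 100-row page at a time;
# bump the offset by 100 per request until a page comes back short.
q="hive_column where __state='ACTIVE' select qualifiedName,name,__guid limit 100 offset 200"
enc=$(python3 -c 'import sys, urllib.parse; print(urllib.parse.quote(sys.argv[1]))' "$q")
echo "https://atlas-host:21000/api/atlas/v2/search/dsl?query=$enc"
# Fetch it with (placeholder credentials):
#   curl -k -u admin:admin -H 'Content-type: application/json' "<URL printed above>"
```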
03-05-2018 08:44 PM

You are exactly right - thank you. Both the Atlas tag and the Ranger policy were in caps, but I don't think Ranger Audit likes caps. I changed both to lower case and access is now denied.

access-denied.png

Thanks so much for your help. (I've never been so happy to see an 'access denied' message!)

03-05-2018 05:31 PM

@Madhan Neethiraj - I've changed the Ranger policy via PUT service/public/v2/api/policy/35

    "conditions": [
      {
        "type": "expression",
        "values": ["ctx.getAttributeValue('DATA_ZONE','name').equals('data_lake')"]
      }
    ]

But I'm still not getting the desired behaviour - I would expect holger_gov to be denied access based on the below flow.

Here's a screenshot of the Ranger Audit.

Could you perhaps paste the full output of your GET service/public/v2/api/policy/{i} policy so I can compare? Here's mine:

    {
      "id": 35,
      "guid": "1d7a6456-840d-4d1d-b5d5-7ec37d50eb8c",
      "isEnabled": true,
      "createdBy": "Admin",
      "updatedBy": "Admin",
      "createTime": 1520122079000,
      "updateTime": 1520255099000,
      "version": 22,
      "service": "sandbox_tag",
      "name": "tenancy_food",
      "policyType": 0,
      "description": "",
      "resourceSignature": "5b2d59d4b57c1fa990c17143d54c89974270cf8e928f982e03c89055cbc69386",
      "isAuditEnabled": true,
      "resources": {
        "tag": {
          "values": ["tenancy_food"],
          "isExcludes": false,
          "isRecursive": false
        }
      },
      "policyItems": [
        {
          "accesses": [
            { "type": "hive:select", "isAllowed": true },
            { "type": "hive:update", "isAllowed": true },
            { "type": "hive:create", "isAllowed": true },
            { "type": "hive:drop", "isAllowed": true },
            { "type": "hive:alter", "isAllowed": true },
            { "type": "hive:index", "isAllowed": true },
            { "type": "hive:lock", "isAllowed": true },
            { "type": "hive:all", "isAllowed": true },
            { "type": "hive:read", "isAllowed": true },
            { "type": "hive:write", "isAllowed": true },
            { "type": "hive:repladmin", "isAllowed": true },
            { "type": "hive:serviceadmin", "isAllowed": true }
          ],
          "users": ["holger_gov"],
          "groups": [],
          "conditions": [
            {
              "type": "expression",
              "values": ["ctx.getAttributeValue('DATA_ZONE','name').equals('data_lake')"]
            }
          ],
          "delegateAdmin": false
        }
      ],
      "denyPolicyItems": [],
      "allowExceptions": [],
      "denyExceptions": [],
      "dataMaskPolicyItems": [],
      "rowFilterPolicyItems": []
    }

03-05-2018 01:01 AM

@Madhan Neethiraj - The Ranger Audit looks to me as though only policy 35 is being used. I've attached some screen prints. I'm not sure if it's relevant, but the Ranger Audit policy, when clicked, doesn't show the ',', although in the actual policy the ',' is still present.