Member since: 09-24-2015
Posts: 32
Kudos Received: 60
Solutions: 4
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1026 | 02-10-2017 07:33 PM
 | 1216 | 07-18-2016 02:14 PM
 | 3114 | 07-14-2016 06:09 PM
 | 16303 | 07-12-2016 07:59 PM
02-10-2017
07:33 PM
1 Kudo
There are two ways to load an existing taxonomy into Atlas:
1.) If you have Hive, run the import-hive.sh utility. This utility scans all of the Hive tables and creates the matching Atlas entries. On HDP 2.5.3, you will find the utility at ./2.5.3.0-37/atlas/hook-bin/import-hive.sh
2.) You can also add entries and update properties using the REST API. One article describing this is available at: https://community.hortonworks.com/content/kbentry/74064/add-custom-properties-to-existing-atlas-types-in-s.html
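After running import-hive.sh, you can confirm that the Hive tables now appear in Atlas. A minimal Python sketch (an assumption for illustration: Atlas at server1:21000 with admin/admin; adjust for your cluster):

import requests

# Count the hive_table entities Atlas now knows about.
r = requests.get("http://server1:21000/api/atlas/entities",
                 params={"type": "hive_table"},
                 auth=("admin", "admin"))
print r.json()["count"]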
02-06-2017
06:59 PM
4 Kudos
Overview
Atlas provides powerful tagging capabilities which allow Data Analysts to identify all data sets containing specific types of data. The Atlas UI itself provides a powerful tag-based search capability which requires no REST API interaction. However, for those of you out there who need to integrate tag-based search with some of your data discovery and governance activities, this posting is for you. Within this posting are instructions for using the Atlas REST API to retrieve entity data based on a TAG name.
Before getting too deep into the Atlas Tag search examples, it is important to recognize that Atlas Tags are basically a form of Atlas type. If you invoke the REST API command "/api/atlas/types", then in the summary output, interspersed between standard Atlas types such as 'hive_table', 'jms_topic', etc., you will find the current set of user-defined Atlas Tags (CUSTOMER & SALES), as shown below:
{
  "count": 35,
  "requestId": "qtp1177377518-81 - c7d4a853-02a0-4a1e-9b50-f7375f6e5f08",
  "results": [
    "falcon_feed_replication",
    "falcon_process",
    "DataSet",
    "falcon_feed_creation",
    "file_action",
    "hive_order",
    "Process",
    "hive_table",
    "hive_db",
    …
    "Infrastructure",
    "CUSTOMER",
    "Asset",
    "storm_spout",
    "SALES",
    "hive_column",
    …
  ]
}
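Since tags appear in the same list as the types, a quick membership test is one way to confirm a tag exists before searching on it. A minimal Python sketch (an assumption for illustration: Atlas at server1:21000 with admin/admin):

import requests

# Fetch all type names and check for the user-defined tag CUSTOMER.
r = requests.get("http://server1:21000/api/atlas/types",
                 auth=("admin", "admin"))
print "CUSTOMER" in r.json()["results"]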
In the rest of this article we will expand on the Atlas types API to explore how we can perform two different types of TAG-based searches. Before going too far, it is important to note that the source code for the following examples is available through this repo.
Tag Search Example #1: Simple REST based Tag Based Search example
In our first tag search example, our objective is to return a list of Atlas data entities which have the queried TAG name assigned. In this example, we are going to search our Atlas instance (on 'server1', port 21000) for all Atlas entities with a tag named CUSTOMER. You will want to replace CUSTOMER with an existing tag on your system.
Our Atlas DSL query to find the CUSTOMER tag using the 'curl' command is shown below:
curl -iv -u admin:admin -X GET http://server1:21000/api/atlas/discovery/search/dsl?query=CUSTOMER
The example above returns a list of the entity GUIDs which have the Atlas Tag 'CUSTOMER' assigned, from the Atlas host 'server1' on port 21000. To run this query on your own cluster or on a sandbox, just substitute the Atlas server host URL, the Atlas server port number, the login information, and your tag name, and then invoke as shown above with curl (or with SimpleAtlasTagSearch.py, the Python example in the repo referenced at the end of this article).
An output from this REST API query on my cluster is shown below:
"count":
2,
"dataType": {
"attributeDefinitions": [
…
],
"typeDescription": null,
"typeName": "__tempQueryResultStruct120"
},
"query":
"CUSTOMER",
"queryType": "dsl",
"requestId": "qtp1177377518-81 -
624fc6b9-e3cc-4ab7-80ba-c6a57d6ef3fd",
"results":
[
{
"$typeName$": "__tempQueryResultStruct120",
"instanceInfo": {
"$typeName$": "__IdType",
"guid": "806362dc-0709-47ca-af16-fac81184c130",
"state": "ACTIVE",
"typeName":
"hive_table"
},
"traitDetails": null
},
{
"$typeName$": "__tempQueryResultStruct120",
"instanceInfo": {
"$typeName$": "__IdType",
"guid":
"4138c963-b20d-4d10-b338-2c334202af43",
"state": "ACTIVE",
"typeName": "hive_table"
},
"traitDetails": null
}
]
}
The results from this query can be thought of as having three sections:
A results header, where you can find the results count
The returned data types
The results (a list of entity GUIDs)
For our purposes, we are really only interested in the list of entities, so all you need to do is focus on extracting the important information from the .results jsonpath object in the returned JSON. Looking at the results section, we observe that two entities have the CUSTOMER tag assigned. Taking the second entity located by the search, with the assigned GUID of '4138c963-b20d-4d10-b338-2c334202af43', we see it is an active entity (not deleted). We can now use the entity search capabilities to retrieve the actual entity, as described in the next example within this article.
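For instance, a minimal sketch that re-runs the tag search and pulls just the active GUIDs out of the .results section (an assumption for illustration: Atlas at server1:21000 with admin/admin):

import requests

# Run the DSL tag search and extract the GUIDs of all ACTIVE entities.
r = requests.get("http://server1:21000/api/atlas/discovery/search/dsl",
                 params={"query": "CUSTOMER"},
                 auth=("admin", "admin"))
guids = [e["instanceInfo"]["guid"]
         for e in r.json()["results"]
         if e["instanceInfo"]["state"] == "ACTIVE"]
print guids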
Example #2: Returning details on all entities based on Tag assignment
The beauty of Example #1 is that we can build an entity list using a single REST API call. However, in the real world we will want access to details about the tagged entities. To accomplish this, we will need a programming interface such as Python, Java, Scala, bash, or whatever your favorite tool is, to pull the GUIDs and then perform entity searches.
For the purposes of this posting, we will use Python to illustrate how to perform more powerful Atlas tag searches. The example below performs two kinds of Atlas REST API queries to build a JSON object containing the details, not just the GUIDs, of the entities with our tag assigned.

import json
import requests

ATLAS_DOMAIN = "server1"   # set these for your cluster
ATLAS_PORT = "21000"
TAG_NAME = "CUSTOMER"

def atlasGET(restAPI):
    url = "http://" + ATLAS_DOMAIN + ":" + ATLAS_PORT + restAPI
    r = requests.get(url, auth=("admin", "admin"))
    return json.loads(r.text)

results = atlasGET("/api/atlas/discovery/search/dsl?query={0}".format(TAG_NAME))
entityGuidList = results['results']

entityList = []
for entity in entityGuidList:
    guid = entity['instanceInfo']['guid']
    entityDetail = atlasGET("/api/atlas/entities/{0}".format(guid))
    entityList.append(entityDetail)

print json.dumps(entityList, indent=4, sort_keys=True)
The output from this script is now available for more sophisticated data governance and data discovery projects.
Atlas Tag Based Search Limitations
As powerful as both the Atlas UI and Atlas REST API tag-based searches are, there are some limitations to be aware of:
Atlas supports searching on only one TAG at a time.
It is impossible to include other entity properties in TAG searches.
The Atlas REST API used for TAG searches can only return a list of GUIDs.
It is not possible to search on TAG attributes.
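One workaround for the single-TAG limitation is to intersect GUID sets client-side. A minimal sketch (an assumption for illustration: Atlas at server1:21000 with admin/admin and two existing tags, CUSTOMER and SALES):

import json
import requests

def guids_for_tag(tag):
    # Run a DSL tag search and return the set of matching entity GUIDs.
    r = requests.get("http://server1:21000/api/atlas/discovery/search/dsl",
                     params={"query": tag}, auth=("admin", "admin"))
    return set(e["instanceInfo"]["guid"] for e in r.json()["results"])

# Entities carrying BOTH tags.
both = guids_for_tag("CUSTOMER") & guids_for_tag("SALES")
print json.dumps(sorted(both), indent=4)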
01-17-2017
04:43 PM
Repo Description
Overview
This project contains examples for how to manage Apache Atlas tag searches using the REST API. The project is written in Python 2.7, though you should be able to analyze the code and apply it to your preferred REST API query tool.
Requirements
Python 2.7 (I have only tested it on 2.7, running on both MacOS and Centos 7)
Python packages: requests, json
Running the example
Modify the properties at the top of the SimpleAtlasTagSearch.py file, then run the following command:
python SimpleAtlasTagSearch.py
If successful, you will see a list of all entities assigned to the TAG_NAME specified in the properties at the top of the program.
Repo Info
Github Repo URL: https://github.com/mfjohnson/AtlasTagSearches.git
Github account name: mfjohnson
Repo name: AtlasTagSearches.git
12-23-2016
05:01 PM
Overview
Data Governance is unique for each organization, and every organization needs to track a different set of properties for its data assets. Fortunately, Atlas provides the flexibility to add new data asset properties to support your organization's data governance requirements. The objective of this article is to describe the steps for using the Atlas REST API to add new properties to your Atlas Types.
Add a new Property for an existing Atlas Type
To simplify this article, we will focus on the three steps required to add a custom property to the standard Atlas type 'hive_table' and enable it for display. Following these steps, you should be able to modify the 'hive_table' Atlas Type and add custom properties whose values can be entered, viewed in the Atlas UI, and searched. To make the article easier to read, the JSON file is shown in small chunks. To view the full JSON file, as well as other files used to research this article, check out this repo.
Step 1: Define the custom property JSON
The most important step of this process is properly defining the JSON used to update your Atlas Type. There are three parts to the JSON object we will pass to Atlas:
The header – contains the type identifier and some other meta information required by Atlas
The actual new property definition
The required existing Atlas type properties
Defining the Header
Frankly, the header is just standard JSON elements which get repeated every time you define a new property. The only change we need to make to the header block shown below, for each example, is to set the 'typeName' JSON element properly. In our case, as shown below, we want the property defined for all Hive tables, so we have set the typeName to 'hive_table'.
{"enumTypes": [],
 "structTypes": [],
 "traitTypes": [],
 "classTypes": [
   {"superTypes": ["DataSet"],
    "hierarchicalMetaTypeName": "org.apache.atlas.typesystem.types.ClassType",
    "typeName": "hive_table",
    "typeDescription": null,
Keep in mind that all the JSON elements shown above pertain to the Atlas type which we plan to modify.
Define the new Atlas Property
For this example, we are adding a property called 'DataOwner'
which we intend to contain the owner of the data from a governance
perspective. For our purposes, we have
the following requirements:
Requirement | Attribute Property | Assignment
---|---|---
The property is searchable | isIndexable | true
The property will contain a string | dataTypeName | string
Not all Hive tables will have an owner | multiplicity | optional
A data owner can be assigned to multiple Hive tables | isUnique | false
Based on the above requirements, we end up with a property definition as shown below:
{"name": "DataOwner",
 "dataTypeName": "string",
 "multiplicity": "optional",
 "isComposite": false,
 "isUnique": false,
 "isIndexable": true,
 "reverseAttributeName": null},
As shown in the full JSON file, it is possible to define multiple properties at one time, so take your time and try to define all of the properties at once.
Make certain you include the existing properties
An annoying thing about the Atlas v1 REST API is the need to include some of the other key properties in your JSON file. For this example, which was run on HDP 2.5.3, I had to define a bunch of them, and every time you add a new custom property it is necessary to include your existing custom properties in the JSON as well. If you check out the JSON file used for this example, you will find a long list of properties which are required as of HDP 2.5.0.
Step 2: PUT the Atlas property update
We now have the full JSON request constructed with
our new property requirements, so it is time to PUT the JSON file using the Atlas REST API v1. For the text of this article I am using 'curl' to make the example clearer, though in the associated repo Python is used to make life a little easier. To execute the PUT REST request, we will first need to collect the following data elements:
Data Element | Where to find it
---|---
Atlas Admin User Id | A defined administrative user for the Atlas system. It is the same user id which you use to log into Atlas.
Atlas Password | The password associated with the Atlas Admin User Id.
Atlas Server | The Atlas Metadata Server. This can be found by selecting the Atlas service in Ambari and then looking in the Summary tab.
Atlas Port | Normally 21000. Check the Ambari Atlas configs for the specific port in your cluster.
update_hive_table_type.json | The name of the JSON file containing our new Atlas property definition.
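Since the associated repo uses Python, here is a minimal sketch of the same PUT (assumptions for illustration: Atlas on server1:21000 with admin/admin, and update_hive_table_type.json in the working directory):

import json
import requests

with open("update_hive_table_type.json") as f:
    payload = f.read()

# PUT the updated type definition to the Atlas v1 types API.
r = requests.put("http://server1:21000/api/atlas/types",
                 data=payload,
                 headers={"Content-Type": "application/json"},
                 auth=("admin", "admin"))
print r.status_code
print json.dumps(r.json(), indent=4)

The equivalent curl command is: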
curl -iv -d @update_hive_table_type.json --header "Content-Type: application/json" -u {Atlas Admin User Id}:{Atlas Password} -X PUT http://{Atlas Server}:{Atlas Port}/api/atlas/types
If all is successful, then we should see a result like the one shown below. The only thing you will need to verify in the result (other than the lack of any reported errors) is that the "name" element is the same as the Atlas type to which you are adding a new custom property.
{
  "requestId": "qtp1177377518-235-fcf1c6f4-5993-49ac-8f5b-cdaafd01f2c0",
  "types": [ { "name": "hive_table" } ]
}
However, if you are like me, then you probably will make a
couple of mistakes along the way. To help you identify the root cause of your errors, here is a short list of errors and how to resolve them:
Error #1: Missing a necessary Atlas property for the Type
An error like the one shown below occurs because the JSON with your new custom property is missing an existing property.
{
  "error": "hive_table can't be updated - Old Attribute stats:numRows is missing",
  "stackTrace": "org.apache.atlas.typesystem.types.TypeUpdateException: hive_table can't be updated - Old Attribute stats:numRows is missing\n\tat
The solution to this problem is to add that property along
with your custom property in your JSON file. If you are uncertain as to the exact definition of the property, then execute the Atlas REST API GET call shown below to list out the Atlas Type whose properties you are currently modifying:
curl -u {Atlas Admin User Id}:{Atlas Password} -X GET http://{Atlas Server}:{Atlas Port}/api/atlas/types
Error #2: Unknown datatype
An error like the one below occurs when you have entered an incorrect Atlas data type:
{
  "error": "Unknown datatype: XRAY",
  "stackTrace": "org.apache.atlas.typesystem.exception.TypeNotFoundException: Unknown
The allowed data types include:
byte
short
int
long
float
double
biginteger
bigdecimal
date
string
{custom types}
The {custom types} entry enables you to reference another Atlas
type. So, for example, if you decide to create a 'SecurityRules' Atlas data type which itself contains a list of properties, you would just insert the SecurityRules type name as the property's data type (for instance, a hypothetical {"name": "AppliedSecurityRules", "dataTypeName": "SecurityRules", ...}).
Error #n: You added a new Atlas property for a type incorrectly and need to delete it
This is the reason why you ALWAYS want to modify Atlas Types
and Properties in a sandbox developer region. DO NOT EXPERIMENT WITH CUSTOMIZING ATLAS TYPES IN PRODUCTION!!!!! If you ignore this standard approach found in most organizations' SDLC, your solution is to delete the Atlas service from within Ambari, re-add the service, and then re-add all your data. Not fun.
Step 3: Check out the results
As we see above, our new custom Atlas 'hive_table' property is now visible in the Atlas UI for all tables. As the property was just defined for all 'hive_table' data assets, the value is null. Your next step, which is covered in the article Modify Atlas Entity properties using REST API commands, is to assign a value to the new property.
Bibliography
Atlas Rest API
Atlas Technical User Guide
Atlas REST API Search Techniques
Modify Atlas Entity properties using REST API commands
12-23-2016
04:56 PM
1 Kudo
Repo Description
Support your organization's Data Governance initiative through the definition of Atlas properties for any type, and enable custom governance data tracking.
Repo Info
Github Repo URL: https://github.com/mfjohnson/AtlasModifyProperties.git
Github account name: mfjohnson
Repo name: AtlasModifyProperties.git
10-26-2016
07:44 AM
9 Kudos
Overview
The whole purpose of the Atlas entity list is to be able to search through all of the entities contained within your data lake and identify specifically those entities which will lead your analysis. A common complaint about the idea of the data lake is that you load too many files into it and it becomes unwieldy. Atlas addresses this problem by providing powerful search tools to identify all of the data entities located within the data lake. The purpose of this article is to explore the four entity search techniques available within Atlas.
Atlas search options compared
The following chart summarizes the capabilities of each of the four Atlas entity search techniques explored in this article. The subsections below go into more detail on how to use each of the search techniques.
Attribute | Entity Search | Qualified Name Search | DSL Search | Full Text Search
---|---|---|---|---
Can identify multiple entities | Yes | No | Yes | Yes
Supports filtering | No | Yes | Yes | Yes
Free text | No | No | No | Yes
Ability to search on sub-attributes | No | No | Yes | No
Primary value | Listing all entities of a given type | Retrieve a specific entity record | Complex searches of entities | Locating records primarily based on name, comment, and description fields
To simplify this article we are going to focus on just the 'hive_table' type, though keep in mind that for all the search examples covered here, you can use any Atlas type.
Preparations for the Search Examples
To support the search examples, we first need to gather the following pieces of information:
Configuration Property | Description | Where to find it
---|---|---
User id | The user id used to log into the Atlas UI screen | This login will be for an Atlas administrative user, with a default id of 'admin'
Password | The matching password to the Atlas user id | For this article I will be using the Atlas default password 'admin'
Atlas Admin Server | The administrative server for Atlas; for this article I will be using the FQDN 'server1.hdp' and assuming the default port 21000 | Go into Ambari, select Atlas, then QuickLinks; the host name for the Atlas UI is the proper value for this parameter
GUID | A unique entity identifier | Found in many of this article's search result responses
Fully Qualified Table Name | A unique identifier for Hive tables | Created by concatenating {database name}.{table name}@{cluster name}. So, if you had a database called 'transports' and a table named 'drivers' on a cluster named HDP, then the fully qualified table name would be 'transports.drivers@HDP'
For all examples contained within this article, to avoid creating overly long result listings, you will often see a "…" marker in the middle of a list. When you see this, know that my intent was to spare you excessive scrolling when reviewing the result set. All of the examples in this article are presented using the curl command, as it is commonly available on Unix (Linux & Mac) based systems. For Python examples, you can find more detailed code through this article's matching GitHub repository.
Atlas Entity Search Example
The Atlas Entity Search technique is the simplest of all of those explored in this article. Its entire purpose is to retrieve all entities of the specified type, with no additional filtering enabled. To retrieve a JSON list containing all the entities, you will use this REST API command:
curl -iv -u {Atlas userId:Password} -X GET http://{Atlas Admin Server}:21000/api/atlas/entities?type=hive_table
Now let's take a look at an actual example:
curl -iv -u admin:admin -X GET http://server1:21000/api/atlas/entities?type=hive_table
In this example, we see as shown below that the response contains two parts: (1) a header set containing an element count and requestId, and (2) a list of GUIDs associated with all the Hive tables known to Atlas.
"requestId": "qtp1783047508-5870 - 867273a6-10e1-4c65-8da2-08a06b89d005",
"results": [
"d3d637c5-df7e-4311-adc3-bcc6c4b81fb1",
"cdbbd999-f789-4d2e-9127-b2443209b3b7",
"848c05fa-f2d9-4482-8892-d4b4fc137ee6",
…
"03e38f24-577a-45ae-b67f-54a3e34f34ce",
"4945f76b-6403-483a-b9aa-3161ce3e4bd6",
"90d76c28-911e-425b-845f-5e1096eed3bb",
"9571fb0e-52f5-4c16-a8f1-d2cf5138824c" ],
"typeName": "hive_table"
} Notice in the example above we find that there are 26 hive
tables. Now unfortunately this sort of output by itself is not useful, so this
API should be considered as one component in the entity retrieval. To actually see the entity full details you
would want to run the REST call below using one of the GUID from the above response: curl -iv -u admin:admin -X GET http://{Ranger
Admin Server}:21000/api/atlas/entities/{GUID} Atlas Qualified Name Search Often we will only
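Putting the two calls together, a minimal Python sketch (an assumption for illustration: Atlas at server1:21000 with admin/admin) lists the GUIDs and then fetches full details for the first one:

import json
import requests

base = "http://server1:21000/api/atlas"
auth = ("admin", "admin")

# Step 1: list all hive_table GUIDs.
guids = requests.get(base + "/entities",
                     params={"type": "hive_table"}, auth=auth).json()["results"]

# Step 2: retrieve the full entity definition for one GUID.
detail = requests.get(base + "/entities/" + guids[0], auth=auth).json()
print json.dumps(detail, indent=4, sort_keys=True)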
Atlas Qualified Name Search
Often we will want to look at only a single record. In these cases, we can use the Atlas qualified name search to retrieve a single entity record. In Atlas, for Hive tables, the qualified name represents the concatenation of the database name, the table name, and finally the cluster name.
curl -iv -u {Atlas userId:Password} -X GET http://{Atlas Admin Server}:21000/api/atlas/entities?type=hive_table&property=qualifiedName&value={Fully Qualified Table Name}
A result set from this query covers all of the metadata captured by Atlas for the Fully Qualified Table Name, as shown below (sorry, it is shown in full, so you will have to scroll):
"definition": {
"id": {
"id": "b78b5541-a205-4f9e-8b81-e20632a88ad5",
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
"state": "ACTIVE",
"typeName": "hive_table",
"version": 0 },
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
"traitNames": [
"TLC" ],
"traits": {
"TLC": {
"jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Struct",
"typeName": "TLC",
"values": {} } },
"typeName": "hive_table",
"values": {
"aliases": null, "columns":
[ {
"id": {
"id": "1690ccc2-d7be-45af-becb-c6b360a1a30f",
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
"state": "ACTIVE",
"typeName": "hive_column",
"version": 0 },
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
"traitNames": [],
"traits": {},
"typeName": "hive_column",
"values": {
"comment": null,
"description": null, "name":
"driverid",
"owner": "hive",
"qualifiedName": "default.drivers.driverid@HDP",
"table": {
"id": "b78b5541-a205-4f9e-8b81-e20632a88ad5",
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
"state": "ACTIVE",
"typeName": "hive_table",
"version": 0 },
"type": "varchar(15)" } }, {
"id": {
"id": "249a7ce3-6b19-418e-9094-7d8a30bc596f",
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
"state": "ACTIVE",
"typeName": "hive_column",
"version": 0 },
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
"traitNames": [
"CARRIER" ],
"traits": {
"CARRIER": {
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Struct",
"typeName": "CARRIER",
"values": {}
} },
"typeName": "hive_column",
"values": {
"comment": null,
"description": null,
"name": "companyid",
"owner": "hive",
"qualifiedName": "default.drivers.companyid@HDP",
"table": {
"id": "b78b5541-a205-4f9e-8b81-e20632a88ad5",
"jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
"state": "ACTIVE",
"typeName": "hive_table",
"version": 0
},
"type": "varchar(15)" } }, {
"id": {
"id": "d3b9557a-5ad0-4585-a9af-e1fed24569fc",
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
"state": "ACTIVE",
"typeName": "hive_column",
"version": 0 },
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
"traitNames": [],
"traits": {},
"typeName": "hive_column",
"values": {
"comment": null,
"description": null, "name":
"customer",
"owner": "hive",
"qualifiedName": "default.drivers.customer@HDP",
"table": {
"id": "b78b5541-a205-4f9e-8b81-e20632a88ad5",
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
"state": "ACTIVE",
"typeName": "hive_table",
"version": 0
},
"type": "varchar(40)" } }, {
"id": {
"id": "143479a3-be79-4f04-b649-4a09b5429ace",
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
"state": "ACTIVE",
"typeName": "hive_column",
"version": 0 },
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
"traitNames": [],
"traits": {},
"typeName": "hive_column",
"values": {
"comment": null,
"description": null,
"name": "drivername",
"owner": "hive",
"qualifiedName": "default.drivers.drivername@HDP", "table": {
"id": "b78b5541-a205-4f9e-8b81-e20632a88ad5",
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
"state": "ACTIVE",
"typeName": "hive_table",
"version": 0
},
"type": "varchar(75)" } }, {
"id": {
"id": "6c3123a9-0d09-490b-840d-6cc012ab69e0",
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
"state": "ACTIVE", "typeName":
"hive_column",
"version": 0 },
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
"traitNames": [], "traits": {},
"typeName": "hive_column",
"values": {
"comment": null,
"description": null,
"name": "yearsdriving", "owner": "hive",
"qualifiedName": "default.drivers.yearsdriving@HDP",
"table": {
"id": "b78b5541-a205-4f9e-8b81-e20632a88ad5",
"jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
"state": "ACTIVE",
"typeName": "hive_table",
"version": 0
},
"type": "int" } }, {
"id": {
"id": "a419ed9f-df56-41cc-90bc-1c00a4d3c428",
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
"state": "ACTIVE",
"typeName": "hive_column",
"version": 0 },
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
"traitNames": [],
"traits": {},
"typeName": "hive_column",
"values": {
"comment": null,
"description": null,
"name": "riskscore",
"owner": "hive",
"qualifiedName": "default.drivers.riskscore@HDP", "table": {
"id": "b78b5541-a205-4f9e-8b81-e20632a88ad5",
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
"state": "ACTIVE",
"typeName": "hive_table",
"version": 0
},
"type": "varchar(5)" } } ],
"comment": null,
"createTime": "2016-10-11T17:11:11.000Z",
"db": {
"id": "332189cc-d994-44c2-8f87-29a28a471434",
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
"state": "ACTIVE",
"typeName": "hive_db",
"version": 0 },
"description": "\"I get my answers from
HCC\"",
"lastAccessTime": "2016-10-11T17:11:11.000Z", "name": "drivers",
"owner": "hive",
"parameters": {
"COLUMN_STATS_ACCURATE":
"{\"BASIC_STATS\":\"true\"}",
"EXTERNAL": "TRUE",
"numFiles": "1",
"numRows": "4278",
"rawDataSize": "1967880",
"totalSize": "68597",
"transient_lastDdlTime": "1476205880" },
"partitionKeys": null,
"qualifiedName": "default.drivers@HDP", "retention":
0,
"sd": {
"id": {
"id": "36166469-1014-4645-98a6-9df34b37a145",
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
"state": "ACTIVE",
"typeName": "hive_storagedesc",
"version": 0 },
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Reference", "traitNames": [],
"traits": {},
"typeName": "hive_storagedesc",
"values": {
"bucketCols": null,
"compressed": false,
"inputFormat": "org.apache.hadoop.hive.ql.io.orc.OrcInputFormat",
"location":
"hdfs://server1.hdp:8020/apps/hive/warehouse/drivers",
"numBuckets": -1,
"outputFormat":
"org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat",
"parameters": null,
"qualifiedName": "default.drivers@HDP_storage",
"serdeInfo": {
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Struct", "typeName":
"hive_serde",
"values": {
"name": null,
"parameters": {
"serialization.format": "1"
},
"serializationLib":
"org.apache.hadoop.hive.ql.io.orc.OrcSerde"
} },
"sortCols": null,
"storedAsSubDirectories": false,
"table": {
"id": "b78b5541-a205-4f9e-8b81-e20632a88ad5",
"jsonClass":
"org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
"state": "ACTIVE", "typeName":
"hive_table",
"version": 0 } } },
"tableType": "EXTERNAL_TABLE",
"temporary": false,
"viewExpandedText": null, "viewOriginalText": null } },
"requestId": "qtp1783047508-5870 -
4f0cc268-7fdc-49ac-a585-74f486ee3786" You thing you will immediately notice is the output, like
most JSON outputs is hierarchical. For
qualified name searches as well as most of the others, it is only possible to
search on the top levels. The one
exception to this are Atlas Entity DSL searches which are covered in the next
section. Atlas Entity DSL search Example The Atlas Entity DSL search is by far the most powerful of
the options covered in this article. The DSL search enables selection of entities based on a combination of properties, Atlas types, and sub-properties within the entity definition. In addition, the DSL search allows you to select only those properties desired for display, so you don't have to output monstrous documents.
DSL Search Example #1: SIMPLE DSL SEARCH BASED ON TABLENAME ONLY
This example is similar to the fully qualified table name example described earlier in this article. The primary difference, as shown below, is the ability to search only on the table name, allowing the database and cluster name to remain wildcards.
curl -iv -u {Atlas userId:Password} -X GET http://{Atlas Admin Server}:21000/api/atlas/discovery/search/dsl?query=hive_table+where+name='{TableName}'
To make the query above work it must be properly encoded. As shown below, the '+' character is used as the spacing between keywords.
{
  "count": 7,
  "dataType": {
    "attributeDefinitions": [
      {
        "dataTypeName": "hive_db",
        "isComposite": false,
        "isIndexable": false,
        "isUnique": false,
        "multiplicity": { "isUnique": false, "lower": 1, "upper": 1 },
        "name": "db",
        "reverseAttributeName": null
      },
      …
  },
  "query": "hive_table where name='drivers'",
  "queryType": "dsl",
  "requestId": "qtp1783047508-19 - 79fcb168-d047-4b70-9129-adebba09b323",
  "results": [ { … } ]
}
In the sample output above, we see that the sample query identified seven entities whose name equals 'drivers' and whose Atlas type is hive_table. In addition, in the result header we see a list of each of the attributes output as part of the result set. Starting with the JSON element "query", we see the query and the query type used. And at the very end of the result set is a section titled "results", which outputs the detail for each entity found, much as we saw earlier in the qualified name search.
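Incidentally, if you drive these DSL queries from Python, the requests library will URL-encode the query string for you, so you do not have to hand-build the '+' and percent escapes. A minimal sketch (an assumption for illustration: Atlas at server1:21000 with admin/admin):

import requests

dsl = "hive_table where name='drivers'"
# requests encodes the spaces and quotes in the DSL query automatically.
r = requests.get("http://server1:21000/api/atlas/discovery/search/dsl",
                 params={"query": dsl}, auth=("admin", "admin"))
print r.json()["count"]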
DSL Search Example #2: Limiting the number of rows output
There are several DSL search options which control things such as row limit and order-by, among others. This section will show how they can be used, via the row limit option, as shown below:
curl -iv -u {Atlas userId:Password} -X GET http://{Atlas Admin Server}:21000/api/atlas/discovery/search/dsl?query=hive_table+where+name='drivers'+limit+3
In the example above, the count of entities found and returned equals three, which exactly matches the limit specified in the query. The contents of the attribute definitions, as well as the resulting entity properties, are the same as in the other DSL searches covered in this article.
{
  "count": 3,
  "dataType": {
    "attributeDefinitions": [
      {
        "dataTypeName": "hive_db",
        "isComposite": false,
        "isIndexable": false,
        "isUnique": false,
        "multiplicity": { "isUnique": false, "lower": 1, "upper": 1 },
        "name": "db",
        "reverseAttributeName": null
      },…
DSL Search Example #3: DSL SEARCH BASED ON TABLENAME ONLY, DEMONSTRATING DISPLAYING ONLY SELECT PROPERTIES
As the prior examples
illustrate, the DSL search can output a lot of data, much of it useless for your analyses. To constrain the properties returned, the Atlas DSL search has the 'select' keyword to limit the properties actually output. The following example illustrates how to invoke the 'select' option. In the example below, the output is constrained to just the properties tableType, temporary, retention, qualifiedName, description, name, owner, comment, and createTime, which reduces the response document's size:
curl -iv -u admin:admin -X GET http://server1:21000/api/atlas/discovery/search/dsl?query=hive_table+where+name='drivers'+select+tableType,temporary,retention,qualifiedName,description,name,owner,comment,createTime+limit+1
The result from the query above is shown in the listing below:
{
  "count": 1,
  "dataType": {
    "attributeDefinitions": [
      {
        "dataTypeName": "string",
        "isComposite": false,
        "isIndexable": false,
        "isUnique": false,
        "multiplicity": { "isUnique": false, "lower": 0, "upper": 1 },
        "name": "tableType",
        "reverseAttributeName": null
      },
      …
  },
  "query": "hive_table where name='drivers' select tableType,temporary,retention,qualifiedName,description,name,owner,comment,createTime limit 1",
  "queryType": "dsl",
  "requestId": "qtp1783047508-21 - 347d7796-cce3-4eb6-8a40-59cedc07433a",
  "results": [
    {
      "$typeName$": "__tempQueryResultStruct48",
      "comment": null,
      "createTime": "2016-10-11T16:50:03.000Z",
      "description": "\"Try this value again\"",
      "name": "drivers",
      "owner": "hive",
      "qualifiedName": "default.drivers@HDP",
      "retention": 0,
      "tableType": "EXTERNAL_TABLE",
      "temporary": false
    }
  ]
}
In the example query above, only one entity was identified
and, as you can see in the results block, only the property names specified after the select option are displayed, rather than the long, hierarchically deep output we have seen in prior examples.
DSL Search Example #4: DSL SEARCH FOR ENTITIES CONTAINING COLUMN NAME
As nice as limiting
output to just the first-order fields may seem, often it is necessary to query on arrays of sub-properties in the entity definition. One complex property often needed in data discovery queries is the 'columns' property. If you want to search for all hive_tables in which a specific field resides, possibly one often used as a foreign key, then you should check out this example:
curl -iv -u admin:admin -X GET http://server1:21000/api/atlas/discovery/search/dsl?query=hive_table%2C+columns+where+name%3D%27tweet_id%27
In the DSL query above, you will see the encoded string %2C
(',') following the data type. That comma indicates that we want to look at the values assigned to the property named immediately afterwards, which in this case is the property named 'columns'. Specifically, the example above requests all Hive tables in which there exists a column named 'tweet_id'. The use of the '+' to separate keywords is just as it has been for the other DSL queries; however, the sub-property portion of the query must be entirely encoded, as you can see in the example above. The result from the above example is:
{
  "count": 1,
  "dataType": {
    "attributeDefinitions": [ {… ],
    "hierarchicalMetaTypeName": "org.apache.atlas.typesystem.types.ClassType",
    "superTypes": [ "DataSet" ],
    "typeDescription": null,
    "typeName": "hive_column"
  },
  "query": "hive_table, columns where name='tweet_id'",
  "queryType": "dsl",
  "requestId": "qtp1783047508-5868 - 3db60f55-8e7b-409b-9a6f-2abea20b8371",
  "results": [
    {
      "$id$": {
        "$typeName$": "hive_column",
        "id": "b32fc0ab-2f66-4ad1-9728-ccd1d48dbf32",
        "state": "ACTIVE",
        "version": 0
      },…
As we see in the output above, only one entity was found with
a column name equal to 'tweet_id'. You will also note in the example above that the output takes the same form as for the other hive_table queries covered so far in this article. If you wanted to, you could make this query more powerful by adding a limit and by including only a select set of properties in the results output.
Atlas Full Text Search Example
Our last type of query free-searches the top-level entity properties for the query string passed. The example below, for instance, looks for any top-level property which contains the string 'sku'.
curl -iv -u admin:admin -X GET http://server1:21000/api/atlas/discovery/search/fulltext?query=sku
The search results for this query option are similar to the entity list option explored at the start of this article, in that it lists only the GUIDs of the entities matching the text supplied ('sku' in this example). One other interesting output property is the score. The 'score' property is an attempt to evaluate the search output quality, as you find in many search engines. The output below shows a sample response from the query above:
4, "query":
"sku",
"queryType": "full-text",
"requestId": "qtp1783047508-5868 -
5270285e-5bf4-43e5-82ed-46dc8901c461",
"results": [ {
"guid": "d37cfeab-afbc-41d0-8dda-71da29043bc3",
"score": 0.7805047, "typeName":
"hive_process" }, {
"guid": "b9b7dae2-775b-4c95-82b6-9d0ab2097a2c",
"score": 0.65042055,
"typeName": "hive_process" }, {
"guid": "c22f07fd-7522-4f37-97eb-96e3a9bc16bc",
"score": 0.6008328,
"typeName": "hive_column" }, {
"guid": "f23097c3-d836-4543-99d6-b08f6cdcc97a",
"score": 0.6008328,
"typeName": "hive_column" } ]} Bibliography:
Bibliography:
Atlas Search Examples in Python
Atlas REST Search API
10-26-2016
01:12 AM
The article Modify Atlas Entity properties using REST API commands contains a full description of how to update both the comment and description entity properties for Atlas managed hive_table types.
10-24-2016
01:56 PM
3 Kudos
Repo Description
A collection of Python based Atlas entity search examples. These search examples use the REST APIs /api/atlas/discovery/search and /api/atlas.
Repo Info
Github Repo URL: https://github.com/mfjohnson/AtlasSearchExamples.git
Github account name: mfjohnson
Repo name: AtlasSearchExamples.git
10-19-2016
01:22 PM
4 Kudos
Overview
This article reviews the steps necessary to update the Description and Comment fields of Hive entities within Atlas. The 0.70 Atlas release will display and allow text searches on the 'description' field, but the Atlas UI does not at this time support the ability to manually enter those properties for a given data asset.
Examined in this article are:
Searching for a Hive_Table entity
Updating a single property of the Hive_Table entity definition ("description")
The Problem:
In release 0.70, Atlas has the ability to monitor additions as well as changes to Hive tables and Hive columns. When Atlas identifies a new entry or a change, the appropriate metadata property is updated for that entity. One very cool aspect of Atlas is the ability to conduct either DSL or free-text searches on any properties set for the entity. Anyone trying to identify datasets to support a specific analytic activity will definitely appreciate the ability to search through all of the entities and quickly discover valuable data assets in the data lake without having to rely on tribal knowledge.
For this article, we will locate a specific table based on its fully qualified name and then assign a new description to the table. The full source code for the examples covered in this article is available on GitHub. The code for this example is written in Python, and there is a full set of instructions in the repository README.md file.
Locating the Entity whose properties require updating
Now let's assume that in our 'HDP' cluster, within the 'default' database, there exists a table named 'drivers'. For this table, our objective is to change the 'description' property from its current value to 'I get my answers from HCC'. Entity property updates are made one at a time, so our first step is to collect the GUID for our target table. As this article is about the update of a property within a Hive_table entity, we will limit the search coverage to identifying a unique Hive_table. The query values for this example are:
Property | Value used in this article | Comments on how to change the provided values for your cluster
---|---|---
Atlas server FQDN | server1.hdp | Use your server's Atlas Metadata Server FQDN
entityType | hive_table | Can be any valid Atlas Type
database name | default | Specify your table's database name
table name | drivers | Can be any Hive table whose metadata is already in Atlas; the table name you provide must already exist on your specified cluster
Cluster name | HDP | The name of your cluster
An Atlas entity can be any of a variety of types. The beauty of this architecture is that the same search steps are available whether seeking a table, a Hive column, or some other Atlas managed type. The format we will use for this search example is:
http://{Atlas server FQDN}:21000/api/atlas/entities?type={entityType}&property=qualifiedName&value={database name}.{table name}@{Cluster name}
So for our example, the exact REST query would be:
http://server1.hdp:21000/api/atlas/entities?type=hive_table&property=qualifiedName&value=default.drivers@HDP
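If you prefer to script this lookup, a minimal Python sketch (an assumption for illustration: Atlas at server1.hdp:21000 with admin/admin) runs the qualified name search and extracts the GUID from definition.id.id:

import requests

r = requests.get("http://server1.hdp:21000/api/atlas/entities",
                 params={"type": "hive_table",
                         "property": "qualifiedName",
                         "value": "default.drivers@HDP"},
                 auth=("admin", "admin"))
# The GUID we need for the update lives at definition.id.id.
print r.json()["definition"]["id"]["id"]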
The full result from this REST query, shown below, will contain the GUID necessary for the update, along with all of the hive_table's metadata information:
{
  "definition": {
    "id": {
      "id": "b78b5541-a205-4f9e-8b81-e20632a88ad5",
      "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
      "state": "ACTIVE",
      "typeName": "hive_table",
      "version": 0
    },
    "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
    "traitNames": [ "TLC" ],
    "traits": {
      "TLC": {
        "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Struct",
        "typeName": "TLC",
        "values": {}
      }
    },
    "typeName": "hive_table",
    "values": {
      "aliases": null,
      "columns": [
        {
          "id": { "id": "1690ccc2-d7be-45af-becb-c6b360a1a30f", "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id", "state": "ACTIVE", "typeName": "hive_column", "version": 0 },
          "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
          "traitNames": [],
          "traits": {},
          "typeName": "hive_column",
          "values": { "comment": null, "description": null, "name": "driverid", "owner": "hive", "qualifiedName": "default.drivers.driverid@HDP", "table": { "id": "b78b5541-a205-4f9e-8b81-e20632a88ad5", "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id", "state": "ACTIVE", "typeName": "hive_table", "version": 0 }, "type": "varchar(15)" }
        },
        {
          "id": { "id": "249a7ce3-6b19-418e-9094-7d8a30bc596f", "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id", "state": "ACTIVE", "typeName": "hive_column", "version": 0 },
          "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
          "traitNames": [ "CARRIER" ],
          "traits": { "CARRIER": { "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Struct", "typeName": "CARRIER", "values": {} } },
          "typeName": "hive_column",
          "values": { "comment": null, "description": null, "name": "companyid", "owner": "hive", "qualifiedName": "default.drivers.companyid@HDP", "table": { "id": "b78b5541-a205-4f9e-8b81-e20632a88ad5", "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id", "state": "ACTIVE", "typeName": "hive_table", "version": 0 }, "type": "varchar(15)" }
        },
        {
          "id": { "id": "d3b9557a-5ad0-4585-a9af-e1fed24569fc", "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id", "state": "ACTIVE", "typeName": "hive_column", "version": 0 },
          "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
          "traitNames": [],
          "traits": {},
          "typeName": "hive_column",
          "values": { "comment": null, "description": null, "name": "customer", "owner": "hive", "qualifiedName": "default.drivers.customer@HDP", "table": { "id": "b78b5541-a205-4f9e-8b81-e20632a88ad5", "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id", "state": "ACTIVE", "typeName": "hive_table", "version": 0 }, "type": "varchar(40)" }
        },
        {
          "id": { "id": "143479a3-be79-4f04-b649-4a09b5429ace", "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id", "state": "ACTIVE", "typeName": "hive_column", "version": 0 },
          "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
          "traitNames": [],
          "traits": {},
          "typeName": "hive_column",
          "values": { "comment": null, "description": null, "name": "drivername", "owner": "hive", "qualifiedName": "default.drivers.drivername@HDP", "table": { "id": "b78b5541-a205-4f9e-8b81-e20632a88ad5", "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id", "state": "ACTIVE", "typeName": "hive_table", "version": 0 }, "type": "varchar(75)" }
        },
        {
          "id": { "id": "6c3123a9-0d09-490b-840d-6cc012ab69e0", "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id", "state": "ACTIVE", "typeName": "hive_column", "version": 0 },
          "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
          "traitNames": [],
          "traits": {},
          "typeName": "hive_column",
          "values": { "comment": null, "description": null, "name": "yearsdriving", "owner": "hive", "qualifiedName": "default.drivers.yearsdriving@HDP", "table": { "id": "b78b5541-a205-4f9e-8b81-e20632a88ad5", "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id", "state": "ACTIVE", "typeName": "hive_table", "version": 0 }, "type": "int" }
        },
        {
          "id": { "id": "a419ed9f-df56-41cc-90bc-1c00a4d3c428", "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id", "state": "ACTIVE", "typeName": "hive_column", "version": 0 },
          "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
          "traitNames": [],
          "traits": {},
          "typeName": "hive_column",
          "values": { "comment": null, "description": null, "name": "riskscore", "owner": "hive", "qualifiedName": "default.drivers.riskscore@HDP", "table": { "id": "b78b5541-a205-4f9e-8b81-e20632a88ad5", "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id", "state": "ACTIVE", "typeName": "hive_table", "version": 0 }, "type": "varchar(5)" }
        }
      ],
      "comment": null,
      "createTime": "2016-10-11T17:11:11.000Z",
      "db": { "id": "332189cc-d994-44c2-8f87-29a28a471434", "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id", "state": "ACTIVE", "typeName": "hive_db", "version": 0 },
      "description": "\"changeMe\"",
      "lastAccessTime": "2016-10-11T17:11:11.000Z",
      "name": "drivers",
      "owner": "hive",
      "parameters": {
        "COLUMN_STATS_ACCURATE": "{\"BASIC_STATS\":\"true\"}",
        "EXTERNAL": "TRUE",
        "numFiles": "1",
        "numRows": "4278",
        "rawDataSize": "1967880",
        "totalSize": "68597",
        "transient_lastDdlTime": "1476205880"
      },
      "partitionKeys": null,
      "qualifiedName": "default.drivers@HDP",
      "retention": 0,
      "sd": {
        "id": { "id": "36166469-1014-4645-98a6-9df34b37a145", "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id", "state": "ACTIVE", "typeName": "hive_storagedesc", "version": 0 },
        "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
        "traitNames": [],
        "traits": {},
        "typeName": "hive_storagedesc",
        "values": {
          "bucketCols": null,
          "compressed": false,
          "inputFormat": "org.apache.hadoop.hive.ql.io.orc.OrcInputFormat",
          "location": "hdfs://server1.hdp:8020/apps/hive/warehouse/drivers",
          "numBuckets": -1,
          "outputFormat": "org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat",
          "parameters": null,
          "qualifiedName": "default.drivers@HDP_storage",
          "serdeInfo": {
            "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Struct",
            "typeName": "hive_serde",
            "values": { "name": null, "parameters": { "serialization.format": "1" }, "serializationLib": "org.apache.hadoop.hive.ql.io.orc.OrcSerde" }
          },
          "sortCols": null,
          "storedAsSubDirectories": false,
          "table": { "id": "b78b5541-a205-4f9e-8b81-e20632a88ad5", "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id", "state": "ACTIVE", "typeName": "hive_table", "version": 0 }
        }
      },
      "tableType": "EXTERNAL_TABLE",
      "temporary": false,
      "viewExpandedText": null,
      "viewOriginalText": null
    }
  },
  "requestId": "qtp511473681-34831 - b088be5b-44e6-4a2c-bd4a-7beeb059cf4f"
}
In the result set above, locate the "id" property value, which is the GUID, and the "description" property, which currently has the value "changeMe". We will use the REST query result's definition.id.id value of 'b78b5541-a205-4f9e-8b81-e20632a88ad5' to support our next REST query, which updates the property value.
Updating an Entity's Property value
Now that we have the GUID, it is time to update the 'description' property from 'changeMe' to 'I get my answers from HCC'. The update-entity-property REST command requires the GUID from the prior search step. To update the property, we will use the POST entity Atlas REST command with the URL query format below, including the string "I get my answers from HCC" in the POST message payload:
http://{Atlas server FQDN}:21000/api/atlas/entities/{GUID from prior search operation}?property={atlas property field name}
So to finish our example, with our payload containing the string "I get my answers from HCC", the actual query would be:
http://server1:21000/api/atlas/entities/b78b5541-a205-4f9e-8b81-e20632a88ad5?property=description
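A minimal Python sketch of this POST (assumptions for illustration: Atlas at server1:21000 with admin/admin; the new value is carried in the request payload, as the article describes):

import requests

guid = "b78b5541-a205-4f9e-8b81-e20632a88ad5"
# POST the new value for the 'description' property of the located entity.
r = requests.post("http://server1:21000/api/atlas/entities/" + guid,
                  params={"property": "description"},
                  data='"I get my answers from HCC"',
                  headers={"Content-Type": "application/json"},
                  auth=("admin", "admin"))
print r.status_code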
The result from the above command will be the current metadata definition for our drivers table in JSON format, as shown below:
{…
  "description": "\"I get my answers from HCC\"",
  "lastAccessTime": "2016-10-11T17:11:11.000Z",
  "name": "drivers",
  "owner": "hive",
  "parameters": {
    "COLUMN_STATS_ACCURATE": "{\"BASIC_STATS\":\"true\"}",
    "EXTERNAL": "TRUE",
    "numFiles": "1",
    "numRows": "4278",
    "rawDataSize": "1967880",
    "totalSize": "68597",
    "transient_lastDdlTime": "1476205880"}
Now let's take a look at the Atlas UI and check the description for the drivers table. Checking in the Atlas UI, we see that the new description property value has been successfully changed.
Next Steps:
This article uses a simple property change example to illustrate the techniques necessary to modify the Atlas metadata for a given entity. After you have completely run through this example, some follow-on activities to experiment with include:
Changing properties for different entity types, such as Hive_column or any of the HBase types.
Attempting to change some of the other top-level property fields.
Going through a list of Hive tables, changing each 'description' property with a value from an external source, as sketched below.
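For that last exercise, a minimal sketch (assumptions for illustration: Atlas at server1:21000 with admin/admin, and a hypothetical dict mapping GUIDs to new descriptions from your external source):

import requests

descriptions = {"b78b5541-a205-4f9e-8b81-e20632a88ad5": "Driver master data"}
for guid, text in descriptions.items():
    # Apply each description through the same property-update POST.
    requests.post("http://server1:21000/api/atlas/entities/" + guid,
                  params={"property": "description"},
                  data='"' + text + '"',
                  auth=("admin", "admin"))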
Resource Bibliography
Atlas Entity REST API:
Atlas Search grammar:
10-18-2016
12:32 AM
2 Kudos
Repo Description
This code example reviews the steps necessary to update the Description and Comment fields of Hive entities within Atlas. The 0.70 Atlas release will display and allow text searches on the 'comments' and 'description' fields, but the Atlas UI does not at this time support the ability to manually enter those properties for a given data asset.
Examined in this example are:
Searching for Hive_Table entities
Updating a single property of the Hive_Table entity definition ("description")
Repo Info
Github Repo URL: https://github.com/mfjohnson/UpdateHiveEntityProperties.git
Github account name: mfjohnson
Repo name: UpdateHiveEntityProperties.git