Member since
01-31-2016
96
Posts
92
Kudos Received
20
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1617 | 02-11-2019 01:04 PM |
| | 1713 | 12-06-2018 01:19 PM |
| | 1001 | 08-23-2018 06:22 AM |
| | 980 | 08-09-2018 11:29 AM |
| | 1237 | 03-29-2018 04:55 PM |
09-12-2022
05:18 AM
How do we check whether Hive is running in HTTP mode? I searched for this in hive-site.xml and found the transport mode set to binary by default. In that case, can you please help me with this issue?
05-07-2020
01:03 PM
Does this mean that the Solr CLI is not working, or does it have something to do with Solr?
12-21-2019
06:33 AM
1 Kudo
You need to find the parent entity that has the association and delete the references. Example: I have a Hive table with an SSN column that had two tags associated with it. Even after the association was removed, the history remains, and deleting the tag fails with the error message "Given type {TagName} has references". In this case I know a hive_column had the reference, and I need to find the proper GUID in order to delete that reference and get this test setup out of my UI.

First, search for the items above in the Atlas UI, checking the option "Show historical entities" in case yours has been deleted. The information displayed in the UI is the one I am after. Next, open the developer tools in Chrome (I generally clear out the history first, to reduce confusion about what I am looking for) and click on ssn, or whichever entity was associated with the classification. The URL of the GET request is shown in the panel, and the GUID in that URL is what you are after. Delete the classification from that GUID:

curl -iv -u tkreutzer \
-X DELETE http://yourhost:21000/api/atlas/v2/entity/guid/ec8a34c7-db67-41b8-a14c-32a19d2166bf/classification/SSNHR

Now you can delete the classification from the UI, provided there are no other associations. If there are, rinse and repeat for each hive_column. I will probably try to figure out a way to do this via REST with Python later, in a way that finds all associated GUIDs for us. Hope this helps. Cheers
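The post above mentions scripting this via REST with Python later. A minimal, untested sketch under the same assumptions (host, credentials, and the SSNHR tag name are placeholders): it uses Atlas's v2 basic search to collect every GUID carrying the classification, including historical entities, then issues the same DELETE per entity.

```python
import base64
import json
import urllib.parse
import urllib.request

ATLAS = "http://yourhost:21000"  # assumption: default Atlas HTTP port


def basic_auth_header(user, password):
    """Basic-auth header for Atlas's REST endpoints."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}


def classification_url(base, guid, tag):
    """Endpoint that detaches one classification from one entity (same as the curl above)."""
    return f"{base}/api/atlas/v2/entity/guid/{guid}/classification/{tag}"


def tagged_guids(base, tag, headers):
    """Basic search for every entity carrying the classification, deleted ones included."""
    query = urllib.parse.urlencode(
        {"classification": tag, "excludeDeletedEntities": "false"}
    )
    req = urllib.request.Request(
        f"{base}/api/atlas/v2/search/basic?{query}", headers=headers
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return [e["guid"] for e in body.get("entities", [])]


def strip_tag(base, tag, headers):
    """Remove the classification from every entity that still references it."""
    for guid in tagged_guids(base, tag, headers):
        req = urllib.request.Request(
            classification_url(base, guid, tag), headers=headers, method="DELETE"
        )
        urllib.request.urlopen(req)


# Example (requires a live Atlas; credentials are placeholders):
#   strip_tag(ATLAS, "SSNHR", basic_auth_header("tkreutzer", "secret"))
```

Once every reference is gone, the type itself can be deleted from the UI as described above.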
12-09-2019
05:05 PM
I found the link with a bit of Google searching: https://atlas.apache.org/0.8.4/Import-Export-API.html. I hope that is the one you are referring to.
01-07-2019
05:31 AM
@Owez Mujawar Could you run the Kafka console consumer on the ATLAS_HOOK and ATLAS_ENTITIES topics when you create a table, and check whether messages are flowing to them?
08-23-2018
10:55 AM
Thanks, it was an authorization problem.
12-06-2018
01:19 PM
@Anpan K Yes, Atlas stores information in HBase. For fast retrieval, the data needs to be indexed so it can be found when queried. Atlas uses JanusGraph, which has two types of indexes: composite and mixed. Composite indexes are supported by the primary storage backend (HBase in this case), while mixed indexes require an indexing backend for full-text search, numeric range search, and so on. Here, Solr is used by Atlas as that indexing backend. You may read about JanusGraph for more detailed information.
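Since the Solr-backed mixed index is what answers Atlas's full-text queries, one way to see it in action is the v2 full-text search endpoint. A hedged sketch (host, credentials, and the query string are placeholders; untested against a live cluster):

```python
import base64
import json
import urllib.parse
import urllib.request


def fulltext_search_url(base, query, limit=10):
    """Atlas v2 full-text search URL -- this query is served by the Solr mixed index."""
    params = urllib.parse.urlencode({"query": query, "limit": limit})
    return f"{base}/api/atlas/v2/search/fulltext?{params}"


def fulltext_search(base, query, user, password):
    """Run a full-text search and return the parsed JSON response."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(
        fulltext_search_url(base, query),
        headers={"Authorization": f"Basic {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Example (requires a live Atlas; credentials are placeholders):
#   hits = fulltext_search("http://localhost:21000", "ssn", "admin", "admin")
```

DSL queries against indexed attributes, by contrast, can often be satisfied by the composite indexes in the primary (HBase) backend.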
03-11-2018
12:22 AM
The compactor calls getCompactionCompressionType() and uses it during compaction:

this.compactionCompression = (this.store.getColumnFamilyDescriptor() == null)
    ? Compression.Algorithm.NONE
    : this.store.getColumnFamilyDescriptor().getCompactionCompressionType();

while getCompressionType() is used by the flusher; see the code in DefaultStoreFlusher#flushSnapshot():

writer = store.createWriterInTmp(cellsCount,
    store.getColumnFamilyDescriptor().getCompressionType(),
    false, true, snapshot.isTagsPresent(), false);
03-09-2018
04:09 PM
@Alisha Vaz POST the attached create-tag.txt to the following REST API to create a tag named PII:

http://localhost:21000/api/atlas/v2/types/typedefs?type=classification

After creating the tag, get the GUID of the Atlas entity that has to be associated with it. For example, to associate it with a hive_table entity named employee in the default database, fetch the entity's GUID using a DSL search: typeName = hive_table, query = where qualifiedName="default.employee@cl1" select __guid

Then POST the associate-tag.txt to the following REST API:

http://localhost:21000/api/atlas/v2/entity/bulk/classification

In associate-tag.txt, replace the existing GUID in the "entityGuids" JSON array with the GUID fetched from the DSL search. Please refer to http://atlas.apache.org/api/v2/resource_TypesREST.html for more information on creating, updating, and deleting tags, and to http://atlas.apache.org/api/v2/resource_EntityREST.html#resource_EntityREST_addClassification_POST for associating tags with entities.
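The two attachments (create-tag.txt and associate-tag.txt) are not reproduced here, but their shapes follow the v2 typedefs and bulk-classification request bodies. A sketch of building equivalent payloads in Python; the tag name and the placeholder GUID are illustrative, not taken from the attachments:

```python
import json


def classification_def(tag_name, description=""):
    """Body for POST /api/atlas/v2/types/typedefs?type=classification."""
    return {
        "classificationDefs": [
            {
                "name": tag_name,
                "description": description,
                "superTypes": [],
                "attributeDefs": [],
            }
        ]
    }


def bulk_association(tag_name, guids):
    """Body for POST /api/atlas/v2/entity/bulk/classification."""
    return {
        "classification": {"typeName": tag_name},
        "entityGuids": list(guids),
    }


# Serialized the way the REST API expects; the GUID comes from the DSL search above.
create_body = json.dumps(classification_def("PII", "personally identifiable information"))
assoc_body = json.dumps(bulk_association("PII", ["replace-with-guid-from-dsl-search"]))
```

Swapping the GUID list is then a code change rather than hand-editing a JSON file.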
03-01-2018
12:08 PM
Thank you @Sharmadha Sainath, it is working fine 🙂
04-22-2018
02:03 PM
You guessed right. It works now, thanks!
07-11-2017
12:18 PM
Thanks @Sharmadha Sainath
06-26-2017
07:40 AM
Thank you so much @Sharmadha Sainath! Your comment is very useful! I tested these commands:

$KAFKA_HOME/bin/kafka-console-producer.sh --broker-list MY_HOSTNAME:6667 --topic ATLAS_HOOK

Then, in the Kafka producer console:

{"version": {"version": "1.0.0"}, "message": {"entities": [{"jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Reference", "id": {"jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id", "id": "-1467290565135246000", "version": 0, "typeName": "hdfs_path", "state": "ACTIVE"}, "typeName": "hdfs_path", "values": {"qualifiedName": "TestKafka", "owner": "admin", "description": "Test kafka", "path": "/user/data/testkafka.csv", "posixPermissions": null, "createTime": "1970-01-01T00:00:00.000Z", "isSymlink": false, "extendedAttributes": null, "numberOfReplicas": 0, "name": "DataKFK"}, "traitNames": [], "traits": {}}], "type": "ENTITY_CREATE", "user": "admin"}}

Atlas then publishes this new entity to the ATLAS_ENTITIES topic, and I could see it in the Atlas UI! Have a good day!
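Hand-typing a message like that into a console producer is error-prone (a single missing quote makes the hook discard it). A sketch that builds the same legacy ENTITY_CREATE message as a Python dict and serializes it, guaranteeing valid JSON; the id, paths, and names mirror the placeholders above:

```python
import json


def hdfs_path_create_message(qualified_name, path, name, owner="admin"):
    """ENTITY_CREATE notification in the legacy v1 shape shown above."""
    entity = {
        "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
        "id": {
            "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
            "id": "-1467290565135246000",  # negative id: entity not yet assigned a GUID
            "version": 0,
            "typeName": "hdfs_path",
            "state": "ACTIVE",
        },
        "typeName": "hdfs_path",
        "values": {
            "qualifiedName": qualified_name,
            "owner": owner,
            "path": path,
            "name": name,
            "isSymlink": False,
            "numberOfReplicas": 0,
        },
        "traitNames": [],
        "traits": {},
    }
    return {
        "version": {"version": "1.0.0"},
        "message": {"entities": [entity], "type": "ENTITY_CREATE", "user": owner},
    }


# One line of valid JSON, ready to paste into kafka-console-producer:
payload = json.dumps(
    hdfs_path_create_message("TestKafka", "/user/data/testkafka.csv", "DataKFK")
)
```

json.dumps also round-trips cleanly, which is an easy sanity check before publishing.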
06-16-2017
04:18 PM
1 Kudo
These are good questions; I hope I can do justice to them with my answers, and elaborate a little more on what @Sarath Subramanian said. Kafka does the work of relaying the notifications from Hive to Atlas: Hive publishes to a topic, Atlas subscribes to it, and thus receives the notifications. There has been some discussion about using Atlas with MySQL and Oracle, but I have not seen an implementation yet. It is possible, provided these two products have notification mechanisms. From what I know, both have database change triggers that can be used to call a REST API, push a message onto a queue, or publish to Kafka. For Oracle, this is what I found. Hope this helps.
01-06-2017
08:41 PM
You can read messages to the console from a particular offset using the Simple Consumer CLI: https://cwiki.apache.org/confluence/display/KAFKA/System+Tools Search for Simple Consumer.
10-26-2016
01:12 AM
The article: Modify Atlas Entity properties using REST API commands contains a full description for how to update both the comment and description entity properties for Atlas managed hive_table types.
09-20-2016
05:05 AM
1 Kudo
Adding this dependency resolved the issue:

<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.7.3</version>
</dependency>

Thanks!