Member since: 08-16-2016
Posts: 59
Kudos Received: 14
Solutions: 2
My Accepted Solutions

Title | Views | Posted
---|---|---
 | 2941 | 04-18-2017 10:48 PM
 | 11201 | 01-17-2017 07:10 PM
08-22-2018
07:04 PM
Is this happening for every entity? Can you search on hive_db or hive_table and see if those are clickable?
09-28-2017
07:53 PM
Your observation about the models is correct: if you create a JSON file and place it under the models directory, it will be picked up during Atlas startup. The links you're looking at for new model implementation refer to the V1 model (which is no longer used); please refer to http://atlas.apache.org/api/v2/index.html#syntax_json for the new JSON structures. Let me know if you need more help with that.
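For reference, here is a minimal sketch of what such a model file could look like using the V2 typedef structure (the entity type name and attribute here are made up purely for illustration; see the link above for the full syntax):

```json
{
  "enumDefs": [],
  "structDefs": [],
  "classificationDefs": [],
  "entityDefs": [
    {
      "name": "sample_dataset",
      "superTypes": ["DataSet"],
      "attributeDefs": [
        {
          "name": "retentionDays",
          "typeName": "int",
          "isOptional": true,
          "cardinality": "SINGLE",
          "isUnique": false,
          "isIndexable": false
        }
      ]
    }
  ]
}
```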
09-27-2017
08:59 PM
What setup steps did you follow for Atlas? There's some mention of TestNG and StormIT in the logs, so I'm not sure what kind of testing you're doing with Atlas. If your Atlas started successfully, can you share the logs? Also, what do you mean by having installed Atlas "natively" on Ubuntu?
09-22-2017
06:22 PM
@Yogeshprabhu Can you share the logs from the Atlas server too (application.log)? It looks like Atlas is not able to post to the ATLAS_ENTITIES topic (outgoing notification).
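To check whether anything is actually landing on the outgoing topic, something like the following can be used (the broker address is a placeholder; adjust it to your cluster):

```
# Hypothetical broker address; replace with a real Kafka broker from your setup.
# Watches the ATLAS_ENTITIES topic for notification messages:
kafka-console-consumer.sh --bootstrap-server broker1:6667 --topic ATLAS_ENTITIES --from-beginning
```

If nothing shows up here while entities are being created, the problem is on the Atlas-to-Kafka side rather than with the consumers.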
09-11-2017
06:28 PM
2 Kudos
Recently Atlas was enhanced to provide a UI-friendly way of searching for the entities it holds. The improvement comes in two parts: a new REST API ( /api/atlas/v2/search/basic ) and a rework of the search UI itself, which adds a popup dialog for specifying the attributes of the type, the tag, or both.

Atlas search details: http://atlas.apache.org/Search.html
Search REST API: http://atlas.apache.org/api/v2/resource_DiscoveryREST.html#resource_DiscoveryREST_searchWithParameters_POST

Here are some screenshots of the search experience.

1. Landing page
2. Type-attribute based search. Notice how the drop-down changes when a different type is selected.
3. Tag-attribute based search. Each attribute type (string, boolean, int, float, date, etc.) has a certain set of operators that can be used on the attributes; these details can be found at the links above.

Once the search results are displayed, there's an option to dynamically retrieve more data (more columns in the UI) about the searched entities. Here's what the selection drop-down looks like (see the drop-down on the extreme right of the screenshot).

A few things to note:

1. The total number of results is unavailable at the moment, as the underlying data store for Atlas doesn't provide that information (efficiently). So the user can only look at 25 results at a time, but can navigate back and forth through the pages.
2. Users can search for entities by any combination of the three search facets (entity/type, tag, and query text).
3. Query text can be any text the user is looking for within an entity; it can also consume Lucene-style queries (if that's preferred).
4. For the sake of simplicity and usability, only ACTIVE entities are listed by default. There's an option to include deleted entities as well (a checkbox called "include historical entities").
5. The search response time is heavily dependent on the specifics of the query, i.e. a vague free-text query like test* will take longer than something like test_table_123*.
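As a sketch of what a call to the basic-search REST API could look like, the request body below combines all three facets following the SearchParameters structure described at the links above (the type, tag, and attribute values are illustrative, not required names):

```python
import json

# Sketch of a request body for POST /api/atlas/v2/search/basic.
# Field names follow the SearchParameters structure; the specific
# type ("hive_table"), tag ("PII"), and attribute values are examples.
search_params = {
    "typeName": "hive_table",          # entity/type facet
    "classification": "PII",           # tag facet
    "query": "test*",                  # free-text facet (Lucene-style allowed)
    "excludeDeletedEntities": True,    # only ACTIVE entities by default
    "limit": 25,                       # page size; no total count is returned
    "offset": 0,                       # used to page back and forth
    "entityFilters": {
        "attributeName": "name",
        "operator": "startsWith",
        "attributeValue": "test_table",
    },
}

payload = json.dumps(search_params)
```

This payload would be sent with Content-Type: application/json to the /api/atlas/v2/search/basic endpoint of the Atlas server.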
08-24-2017
05:30 PM
Yeah, running against external HBase and Solr does require a lot of setup, which is mostly done by Ambari in any HDP deployment.
08-14-2017
05:23 PM
If you do a mvn clean install -DskipTests -Pdist,berkeley-elasticsearch (assuming you want to run with embedded dependencies) and navigate to distro/target/atlas-<version>-bin, you will be able to run Atlas on the local machine using the start script (atlas_start.py) under the bin folder.

If you want to run against external HBase and Solr instead, you would need to have the following running before trying to start Atlas:

1. ZooKeeper
2. HBase
3. Kafka
4. Solr (cloud mode)

Having these running is the first step towards starting Atlas. Once these services are up and running, you'll have to make sure that ZooKeeper has registered the HBase servers, Kafka brokers, and Solr servers as well. The next step is to update atlas-application.properties with the correct addresses/URLs for the above services. Once that's done, Atlas should be able to start serving requests.

PS: Running in embedded mode is the fastest and easiest way to get Atlas up and running on a laptop/desktop. Hope that helps; if it does, please upvote and accept the answer.
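The embedded-mode path can be sketched as a short shell session (run from the root of the Atlas source checkout; the directory name depends on the version you built):

```
# Build the distribution with the embedded BerkeleyDB + Elasticsearch profiles
mvn clean install -DskipTests -Pdist,berkeley-elasticsearch

# Substitute the actual version for <version> in the directory name
cd distro/target/atlas-<version>-bin

# Start Atlas using the bundled start script
bin/atlas_start.py
```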
08-11-2017
07:28 PM
Could you share the Atlas config? It looks like there's some failure while creating a new index.
08-03-2017
03:45 PM
Hi @Muhammad Imran Tariq, if you're looking to bring data FROM MySQL into Atlas, that can be done using Sqoop.
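For context, when the Atlas Sqoop hook is configured, a regular Sqoop import into Hive is what gets captured as metadata/lineage. A typical import might look like this (the hostname, database, and table names are made up for illustration):

```
# Hypothetical connection details; with the Atlas Sqoop hook configured,
# the metadata of this import is published to Atlas automatically.
sqoop import \
  --connect jdbc:mysql://mysql-host:3306/sales \
  --username sqoop_user -P \
  --table customers \
  --hive-import \
  --hive-table default.customers
```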
08-02-2017
07:24 PM
In order to ingest the metadata for a SQL-like source, you can use Sqoop, which integrates with Atlas. If you're looking to use DynamoDB/MySQL as the datastore for Atlas, that's not currently possible with Titan 0.5.4 (only HBase, Cassandra, and BerkeleyDB are supported).