Created 04-26-2017 12:58 PM
Atlas is designed to exchange metadata with other tools and processes within and outside of the Hadoop stack, thereby enabling platform-agnostic governance controls that effectively address compliance requirements.
In Atlas, is there a process we can follow to exchange metadata between different clusters, or outside of the cluster network?
I am using HDP 2.5.0.
Looking for help!
Created 04-26-2017 06:36 PM
Atlas usually relies on Kafka notifications for monitoring supported components like Hive, Falcon, Storm, etc. As long as the Kafka topic ATLAS_HOOK exists and the tool/process integrating with Atlas is able to post messages to that topic, the metadata ingest will work.
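For example, a minimal producer sketch using kafka-python might look like the following. The broker address, user, type name, and entity attributes are placeholders, and the exact hook-message JSON schema is version-specific, so check the notification format your Atlas build expects before relying on this:

```python
import json
from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="kafka-broker:6667",  # assumption: your broker host/port
    value_serializer=lambda m: json.dumps(m).encode("utf-8"),
)

# Assumed notification shape; the real hook-message schema varies by Atlas version.
notification = {
    "type": "ENTITY_CREATE",
    "user": "etl_user",
    "entities": [{
        "typeName": "DataSet",  # a type already registered in Atlas
        "values": {
            "name": "orders_feed",               # hypothetical entity attributes
            "qualifiedName": "orders_feed@cluster1",
        },
    }],
}

# Post the message to the ATLAS_HOOK topic that Atlas consumes from.
producer.send("ATLAS_HOOK", notification)
producer.flush()
```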
HTH
Created 05-03-2017 03:36 PM
@anaik I didn't quite follow what you said.
I just want to test the metadata exchange mechanism of Atlas in HDP 2.5.0.
Do you have any examples/docs to follow? Please share.
Created 05-11-2017 05:16 PM
I'm assuming you want Atlas to exchange metadata with 3rd-party governance tools. Is that the case? If so, Atlas allows communication and metadata exchange via its REST interface. External tools are responsible for pushing metadata into Atlas via the REST interface, and the onus is also on the 3rd-party tool to extract metadata from Atlas through the REST interface.
Usually what organizations wind up doing is writing scripts to manage this push/pull to/from Atlas. Take a look at the link below for instructions on how to interact with the REST API to import/export Atlas metadata; a quick sketch of that push/pull pattern follows it. The link references Atlas 0.7, which ships with HDP 2.5.
https://atlas.incubator.apache.org/0.7.0-incubating/AtlasTechnicalUserGuide.pdf
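As a rough illustration, here is what such a push/pull script against the Atlas v1 REST API could look like in Python. The host, port, credentials, response key, and entity payload are all placeholders; the real entity body must follow the v1 serialization for types registered in your Atlas instance:

```python
import json
import requests

ATLAS = "http://atlas-host:21000/api/atlas"  # assumption: default Atlas server/port
AUTH = ("admin", "admin")                    # placeholder credentials

# Pull: list entity GUIDs of a given type, then fetch one full entity definition.
resp = requests.get(ATLAS + "/entities", params={"type": "hive_table"}, auth=AUTH)
resp.raise_for_status()
guids = resp.json().get("results", [])       # assumption: key name in the v1 response
if guids:
    entity = requests.get(ATLAS + "/entities/" + guids[0], auth=AUTH).json()
    print(json.dumps(entity, indent=2))

# Push: create an entity. 'payload' is a stand-in; the real body must be a
# v1-serialized entity instance for a type already registered in Atlas.
payload = {"typeName": "DataSet", "values": {"name": "orders_feed"}}
resp = requests.post(ATLAS + "/entities", json=payload, auth=AUTH)
print(resp.status_code, resp.text)
```

A scheduled job running scripts like this on each side is the usual way to keep an external catalog and Atlas in sync until an out-of-the-box connector exists for your tool.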
We are working on building out-of-the-box integrations with some enterprise governance tools to provide seamless metadata exchange. At this point, integration is complete with Waterline (http://www.waterlinedata.com/), and we continue to work on others.