Atlas is designed to exchange metadata with other tools and processes within and outside of the Hadoop stack, thereby enabling platform-agnostic governance controls that effectively address compliance requirements.
In Atlas, is there any process we can follow to exchange metadata between different clusters, or outside of the cluster network?
I'm using HDP 2.5.0.
Looking for help!
Atlas usually relies on Kafka notifications for monitoring supported components like Hive, Falcon, Storm, etc. As long as the Kafka topic ATLAS_HOOK exists and the tool/process integrating with Atlas is able to post messages to that topic, the metadata ingest will work.
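For illustration only, here is a rough sketch of building such a notification message. The envelope fields below mirror the general shape of Atlas v1 hook messages, but the exact schema varies by Atlas version, so treat the field names as assumptions to verify against your deployment:

```python
import json

def build_hook_message(type_name, qualified_name, user="atlas"):
    """Build a minimal ENTITY_CREATE-style notification payload.

    Illustrative only: verify the hook-message schema for your
    Atlas version before relying on these field names.
    """
    entity = {
        "typeName": type_name,
        "values": {"qualifiedName": qualified_name, "name": qualified_name},
    }
    return {
        "version": {"version": "1.0.0"},
        "message": {"type": "ENTITY_CREATE", "user": user, "entities": [entity]},
    }

# Serialize one message per line, ready to post to the ATLAS_HOOK topic, e.g.:
#   echo '<json line>' | kafka-console-producer.sh --broker-list <broker> --topic ATLAS_HOOK
msg = json.dumps(build_hook_message("hive_table", "default.sales@cluster1"))
print(msg)
```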
I'm assuming you want Atlas to exchange metadata with third-party governance tools. Is that the case? If so, Atlas allows communication and metadata exchange via its REST interface. External tools are responsible for pushing metadata into Atlas via the REST interface, and the onus is also on the third-party tool to extract metadata from Atlas through it.
Usually organizations wind up writing scripts to manage this push/pull to/from Atlas. Take a look at the two links below for instructions on how to interact with the REST API to import/export Atlas metadata. The links reference Atlas 0.7, which ships with HDP 2.5.
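As a sketch of that push/pull scripting pattern: the host and port below are placeholders, and the `/api/atlas/entities` path reflects the v1 REST API that ships with Atlas 0.7, so verify both against your cluster before use:

```python
import json
import urllib.request

ATLAS_BASE = "http://atlas-host:21000/api/atlas"  # placeholder host/port

def entity_url(guid=None):
    """Build the v1 REST URL for pushing (POST) or pulling (GET) entities."""
    base = ATLAS_BASE + "/entities"
    return base + "/" + guid if guid else base

def pull_entity(guid):
    """GET one entity's metadata from Atlas (requires a live server)."""
    with urllib.request.urlopen(entity_url(guid)) as resp:
        return json.load(resp)

# Network call left commented out since no server exists in this sketch:
# print(pull_entity("example-guid"))
print(entity_url("example-guid"))
```

A real script would add authentication (e.g. Kerberos/SPNEGO on secured clusters) and error handling around the HTTP calls.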
We are working on building out-of-the-box integrations with some enterprise governance tools to provide seamless metadata exchange. At this point, integration is complete with Waterline (http://www.waterlinedata.com/), and we continue to work on others.