
Populate metadata repository from RDBMS in Apache Atlas



I am new to Apache Atlas.

We are planning to use Apache Atlas as a metadata repository. The data comes from different sources such as MySQL, Oracle, and CSV/JSON files.

Is there a guide listing the steps to populate the Apache Atlas metadata repository from a MySQL database?

Is it necessary to load the data into Apache Hive from the data sources, or can I populate only the metadata repository without actually fetching data from the sources?





I am repeating my answer from the other thread. If you need implementation details, let me know.

First you create the new types in Atlas: for example, in the case of Oracle, an Oracle table type, a column type, etc. You would then create a script or process that pulls the metadata from the source metadata store. Once you have the metadata you want to store in Atlas, your process would create the associated Atlas entities, based on the new types, using the Java API or JSON representations through the REST API directly. If you wanted to, you could also add lineage as you store the new entities. That's it: the metadata from the external metadata store is now in Atlas.
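To make the two steps concrete, here is a minimal sketch against the Atlas v2 REST API. Everything specific in it is an assumption: the host/credentials (`localhost:21000`, `admin:admin`), the type name `oracle_table`, and the attribute names are illustrative, not a fixed convention.

```shell
# Minimal sketch, not production code. Host, credentials, and the type/attribute
# names below are assumptions; adjust them for your cluster.
ATLAS="http://localhost:21000"

# 1) Define a new entity type for Oracle tables. Making it a subtype of
#    DataSet lets it participate in lineage later.
cat > oracle_table_typedef.json <<'EOF'
{
  "entityDefs": [
    {
      "name": "oracle_table",
      "superTypes": ["DataSet"],
      "attributeDefs": [
        { "name": "owner",      "typeName": "string", "isOptional": true, "cardinality": "SINGLE" },
        { "name": "tablespace", "typeName": "string", "isOptional": true, "cardinality": "SINGLE" }
      ]
    }
  ]
}
EOF
# Register the type (|| true keeps the sketch going if Atlas is not reachable).
curl -s -u admin:admin -H 'Content-Type: application/json' \
  -X POST "$ATLAS/api/atlas/v2/types/typedefs" \
  -d @oracle_table_typedef.json || true

# 2) Create an entity for one table that your extraction script found in the
#    source metadata store (e.g. Oracle's data dictionary).
cat > customers_entity.json <<'EOF'
{
  "entity": {
    "typeName": "oracle_table",
    "attributes": {
      "name": "CUSTOMERS",
      "qualifiedName": "orcl.sales.CUSTOMERS@mycluster",
      "owner": "SALES",
      "tablespace": "USERS"
    }
  }
}
EOF
curl -s -u admin:admin -H 'Content-Type: application/json' \
  -X POST "$ATLAS/api/atlas/v2/entity" \
  -d @customers_entity.json || true
```

Your extraction process would loop over the source catalog and emit one such entity per table; `qualifiedName` must be unique per entity.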


@Vadim Vaks would you be kind enough to provide the implementation details for the above?

Thanking you in anticipation.

Would you like to provide the implementation details?


I think Sqoop is built for this specific scenario (or at least I've used it to import metadata from a MySQL database). Once you kick off the Sqoop import process, the Atlas hook in Sqoop will take care of creating the correct types/entities in Atlas.
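A typical import of this kind looks like the sketch below. All the specifics (host, database, credentials, table name) are made up for illustration; with the Atlas hook configured, the import itself is enough to publish the metadata, and no extra Atlas-specific flags are needed on the command line.

```shell
# Hypothetical example: import a MySQL table into Hive. The Sqoop Atlas hook,
# when configured, publishes the resulting types/entities and lineage to Atlas.
sqoop import \
  --connect jdbc:mysql://mysql-host:3306/sales \
  --username sqoop_user -P \
  --table customers \
  --hive-import \
  --hive-database default \
  --hive-table customers
```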


@anaik thank you for the response.

Would you be kind enough to provide a step-by-step guide on how to import metadata from a MySQL database into Atlas, if possible?


The documentation below has step-by-step details on how to use Sqoop to move data from any RDBMS into Hive.

You can refer to this as well:

To get the metadata of all this Sqoop-imported data into Atlas, make sure the configurations below are set properly.
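For reference, wiring up the Sqoop Atlas hook by hand usually comes down to the property below in sqoop-site.xml (the class name is the standard hook shipped with Atlas; file locations vary by distribution), plus copying Atlas's atlas-application.properties into Sqoop's configuration directory so the hook can find the Atlas notification endpoints.

```xml
<!-- sqoop-site.xml: route Sqoop job metadata to the Atlas hook -->
<property>
  <name>sqoop.job.data.publish.class</name>
  <value>org.apache.atlas.sqoop.hook.SqoopHook</value>
</property>
```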

Please note that the configuration step above is not needed if your cluster configuration is managed by Ambari. Hope this helps.


Thank you @Ayub Khan. I am using an embedded HBase/Solr configuration of Atlas, so I presume I need to set the configurations manually.

@Tariq Did you perform the manual configuration? If this issue is resolved, please close this thread. thanks.