Support Questions


Populate metadata repository from RDBMS in Apache Atlas

Explorer

Hi,

I am new to Apache Atlas.

We are planning to use Apache Atlas as our metadata repository. The data comes from different sources, such as MySQL, Oracle, and CSV/JSON files.

Is there a guide listing the steps to populate the Apache Atlas metadata repository from a MySQL database?

Is it necessary to load the data into Apache Hive from the sources, or can I populate the metadata repository without actually fetching the data?

Regards,

8 REPLIES

Guru

@Tariq

I am repeating my answer from the other thread. If you need implementation details, let me know.

First, create the new types in Atlas. In the case of Oracle, for example, that means an Oracle table type, a column type, etc. Then create a script or process that pulls the metadata from the source metadata store. Once you have the metadata you want to store in Atlas, your process creates the associated Atlas entities, based on the new types, using the Java API or JSON representations through the REST API directly. If you want, you can also add lineage as you store the new entities. That's it: now the metadata from the external metadata store is in Atlas.
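To make the steps above concrete, here is a minimal sketch of the two payloads involved: a custom entity type definition and one entity of that type. The type name `oracle_table`, its attributes, and the qualified name are hypothetical examples, not part of any shipped Atlas model; the endpoints shown in the comments are the Atlas REST v2 paths.

```python
import json

# Assumed default Atlas server; adjust host/port for your deployment.
ATLAS_URL = "http://localhost:21000/api/atlas/v2"

# 1) A custom entity type definition.
#    POST {ATLAS_URL}/types/typedefs
typedefs = {
    "entityDefs": [{
        "name": "oracle_table",        # hypothetical type name
        "superTypes": ["DataSet"],     # inherit DataSet so lineage works
        "attributeDefs": [
            {"name": "owner", "typeName": "string",
             "isOptional": True, "cardinality": "SINGLE"},
        ],
    }]
}

# 2) An entity of that type, created by your extraction script.
#    POST {ATLAS_URL}/entity
entity = {
    "entity": {
        "typeName": "oracle_table",
        "attributes": {
            # qualifiedName must be unique; example naming scheme only
            "qualifiedName": "ORCL.HR.EMPLOYEES@prod",
            "name": "EMPLOYEES",
            "owner": "HR",
        },
    }
}

print(json.dumps(typedefs, indent=2))
print(json.dumps(entity, indent=2))
```

In practice you would send each payload with an HTTP client (e.g. curl or Python's urllib/requests) using the Atlas admin credentials; the script above only builds and prints the JSON so you can inspect it first.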

Rising Star

@Vadim Vaks would you be kind enough to provide the implementation details for the above?

Thanking you in anticipation.


Would you like to provide the implementation details?

Expert Contributor

I think Sqoop is built for this specific scenario (or at least I've used it to import metadata from a MySQL database). Once you kick off the Sqoop import, the Atlas hook in Sqoop will take care of creating the correct types/entities in Atlas.
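A sketch of what such an import looks like; the host, database, table, and credentials below are placeholders you would replace with your own. With the Atlas Sqoop hook configured, this one command both loads the data into Hive and publishes the corresponding metadata to Atlas.

```shell
# Import a MySQL table into Hive; placeholders throughout.
sqoop import \
  --connect jdbc:mysql://db-host:3306/sales \
  --username etl_user -P \
  --table customers \
  --hive-import \
  --hive-table default.customers
```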

HTH

Rising Star
@anaik, thank you for the response.

Would you be kind enough to provide a step-by-step guide on how to import metadata from a MySQL database into Atlas, if possible?


@Tariq

The documentation below has step-by-step details on how to use Sqoop to move data from any RDBMS to Hive.

https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.3/bk_data-access/content/using_sqoop_to_move_...

You can refer to this as well: http://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html#_literal_sqoop_import_all_tables_literal

To get the metadata of the Sqoop-imported data into Atlas, make sure the configurations below are set properly.

http://atlas.incubator.apache.org/Bridge-Sqoop.html

Please note the above configuration step is not needed if your cluster configuration is managed by Ambari. Hope this helps.
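Per the Bridge-Sqoop page linked above, the manual setup essentially means registering the Atlas hook class in sqoop-site.xml and making atlas-application.properties visible on Sqoop's classpath. A sketch of the sqoop-site.xml fragment (property name as documented there):

```xml
<!-- sqoop-site.xml: register the Atlas hook so sqoop jobs publish metadata -->
<property>
  <name>sqoop.job.data.publish.class</name>
  <value>org.apache.atlas.sqoop.hook.SqoopHook</value>
</property>
```

You would also copy atlas-application.properties from the Atlas conf directory into Sqoop's conf directory, as the bridge documentation describes.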

Rising Star

Thank you @Ayub Khan, I am using an embedded hbase-solr configuration of Atlas so I presume I need to set the configurations manually.


@Tariq Did you perform the manual configuration? If this issue is resolved, please close this thread. Thanks.