I am new to Apache Atlas.
We are planning to use Apache Atlas as our metadata repository. The data comes from different sources such as MySQL, Oracle, and CSV/JSON files.
Is there a help guide listing the steps to populate the Apache Atlas metadata repository from a MySQL database?
Is it necessary to load the data into Apache Hive from the data sources, or can I populate the metadata repository without actually fetching data from the sources?
I am repeating my answer from the other thread. If you need implementation details, let me know.
First, you create the new types in Atlas. For example, in the case of Oracle: an Oracle table type, a column type, etc. You would then create a script or process that pulls the metadata from the source metadata store. Once you have the metadata you want to store in Atlas, your process creates the associated Atlas entities, based on the new types, using the Java API or JSON representations sent to the REST API directly. If you want, you can also add lineage as you store the new entities. That's it: the metadata from the external metadata store is now in Atlas.
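To make that concrete, here is a minimal sketch of the JSON such a process might send to the Atlas v2 REST API. The type name `oracle_table`, the attribute names, the `@mycluster` suffix, and the server location (`localhost:21000` with the default `admin/admin` login) are all assumptions for illustration, not anything Atlas ships with:

```python
import base64
import json
import urllib.request

# Hypothetical minimal type definition for an "oracle_table" entity type.
# Registered with: POST /api/atlas/v2/types/typedefs
typedef_payload = {
    "entityDefs": [{
        "name": "oracle_table",
        "superTypes": ["DataSet"],  # inherits name/qualifiedName attributes
        "attributeDefs": [
            {"name": "owner", "typeName": "string", "isOptional": True},
        ],
    }]
}

# One entity instance of that type, created with: POST /api/atlas/v2/entity
entity_payload = {
    "entity": {
        "typeName": "oracle_table",
        "attributes": {
            # qualifiedName should be unique per cluster; the "@mycluster"
            # suffix is a common convention, not a requirement.
            "qualifiedName": "ORCL.HR.EMPLOYEES@mycluster",
            "name": "EMPLOYEES",
            "owner": "HR",
        },
    }
}

def post_to_atlas(path, payload, host="http://localhost:21000"):
    """POST a JSON payload to Atlas; requires a running Atlas server."""
    req = urllib.request.Request(
        host + path,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Basic "
            + base64.b64encode(b"admin:admin").decode(),
        },
    )
    return urllib.request.urlopen(req)

# With a server available you would call, in order:
#   post_to_atlas("/api/atlas/v2/types/typedefs", typedef_payload)
#   post_to_atlas("/api/atlas/v2/entity", entity_payload)
```

Lineage would be added the same way, by creating entities of a process type (a subtype of `Process`) whose inputs and outputs reference the entities above.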
I think Sqoop is built for this specific scenario (or at least I've used it to import metadata from a MySQL database). Once you kick off the Sqoop import process, the Atlas hook in Sqoop will take care of creating the correct types/entities in Atlas.
The documentation below has step-by-step details on how to use Sqoop to move data from any RDBMS to Hive:
http://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html#_literal_sqoop_import_all_tables_literal
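As a sketch, an invocation for that guide's scenario might look like this. The host, port, database name, username, and password-file path are all placeholders to adapt to your environment:

```shell
# Hypothetical example: import every table from a MySQL database into Hive.
# All connection details below are placeholders.
sqoop import-all-tables \
  --connect jdbc:mysql://mysql-host:3306/sales_db \
  --username sqoop_user \
  --password-file /user/sqoop/mysql.password \
  --hive-import \
  --warehouse-dir /user/hive/warehouse
```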
To get the metadata of all this Sqoop-imported data into Atlas, make sure the configurations below are set properly.
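If I remember the Atlas Sqoop bridge setup correctly, the key setting on a manually managed cluster is the hook class in `sqoop-site.xml`; you also need `atlas-application.properties` and the Atlas hook jars on Sqoop's classpath:

```xml
<!-- sqoop-site.xml: publish Sqoop job metadata to the Atlas hook -->
<property>
  <name>sqoop.job.data.publish.class</name>
  <value>org.apache.atlas.sqoop.hook.SqoopHook</value>
</property>
```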
Please note that this configuration step is not needed if your cluster configuration is managed by Ambari. Hope this helps.