Is Kafka necessary to import metadata into Atlas?
Labels:
- Apache Atlas
- Apache Hive
- Apache Kafka
Created 05-03-2016 07:29 AM
I am learning Apache Atlas and have read the demo at https://github.com/shivajid/atlas/blob/master/tutorial/Step1.md.
That tutorial shows how to import metadata into Atlas without using Kafka.
So Kafka is not necessary when we want to import metadata from Hive into Atlas, right?
Also, from the docs at http://atlas.apache.org/Architecture.html, I understand that communication between the Hook and Atlas can fail because of network issues, leaving the metadata inconsistent. Kafka is there to prevent that inconsistency, right?
What other functions or benefits does Kafka provide in the communication between the Hook and Atlas?
Thank you very much.
Created 05-03-2016 07:33 AM
Hello Ethan,
No, Kafka is not necessary for importing data into Atlas. Atlas actually listens to services such as Hive, Sqoop, Falcon, etc. to import metadata automatically. You can also interact with the Atlas APIs, REST or otherwise, to import your own data, tags for example.
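For instance, here is a minimal sketch of pushing a custom entity into Atlas over its REST API, with no Kafka involved. The endpoint path and payload shape vary by Atlas version (the older v1 "entities" API vs. the later v2 API shown here), and the host, port, credentials, and entity attributes are placeholders:

```python
import requests

# Sketch: create an entity in Atlas directly over REST, no Kafka involved.
# Endpoint path and payload shape differ across Atlas versions; host, port,
# credentials, and attribute values below are placeholders, not real values.
ATLAS_URL = "http://atlas-host:21000/api/atlas/v2/entity"

entity = {
    "entity": {
        "typeName": "hive_table",  # any type already defined in the Atlas type system
        "attributes": {
            "name": "sales_fact",
            "qualifiedName": "default.sales_fact@mycluster",
            "owner": "etl_user",
        },
    }
}

resp = requests.post(ATLAS_URL, json=entity, auth=("admin", "admin"))
resp.raise_for_status()
print(resp.json())
```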
Kafka is very useful, for example, in the communication with Ranger for security policies. As you add tags to data in Atlas, you want Ranger to pick them up as soon as possible, and Kafka is that gateway.
In the not-so-distant future Kafka will also be a service monitored by Atlas, since it too is a gateway for data inside Hadoop and as such is a source Atlas should provide governance for.
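To illustrate that gateway role, here is a rough sketch of a consumer watching the Atlas notification topic on Kafka, which is the kind of thing Ranger's tag sync does. ATLAS_ENTITIES is the default topic Atlas publishes entity and tag changes to; the broker address is a placeholder and the message handling is deliberately simplified:

```python
from kafka import KafkaConsumer

# Sketch: consume Atlas entity/tag change notifications from Kafka, the way a
# downstream system such as Ranger tag sync would. Broker address is a placeholder.
consumer = KafkaConsumer(
    "ATLAS_ENTITIES",                      # default Atlas notification topic
    bootstrap_servers="kafka-broker:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda m: m.decode("utf-8"),
)

for message in consumer:
    # Each message is a JSON notification describing an entity create/update/delete
    # or a classification (tag) change; a real consumer would parse it and act on it.
    print(message.value)
```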
Hope this helps.
Created 05-06-2016 12:43 AM
Thank you very much.
