I am trying to set up an Ambari cluster for Kafka as an active-active cluster.
What is the best way of setting this up?
A Storm application writes to a Kafka topic on two clusters (cluster A and cluster B, which are active-active); both clusters will have the same data and the same Kafka topic name. When we consume this data from the topic on both clusters, how will the offsets behave?
What is the best way to set up this redundant cluster as active-active, so that if one Kafka cluster is down, the other is still available for reading the data?
Any help/workarounds would be appreciated.
I believe you can use Kafka MirrorMaker for this scenario: you write data to one cluster, and it replicates the messages to the other.
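As a sketch, the legacy MirrorMaker tool takes a consumer config pointing at the source cluster and a producer config pointing at the target cluster. The file names and topic name below are placeholders, not anything from your setup:

```shell
# consumer.properties should point bootstrap.servers at cluster A (source);
# producer.properties should point bootstrap.servers at cluster B (target).
# "my-topic" is a placeholder for your actual topic name.
bin/kafka-mirror-maker.sh \
  --consumer.config consumer.properties \
  --producer.config producer.properties \
  --whitelist "my-topic"
```

Note that MirrorMaker replicates the messages but not the offsets: the same record can land at a different offset on the target cluster.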
I hope that helps.
Thank you for your inputs.
I am still not clear about a few things.
MirrorMaker just takes a copy of the data from one cluster to another cluster,
but here is a scenario.
My application writes data into cluster A, and on cluster B we set up MirrorMaker to copy data from cluster A to cluster B. What if cluster A is down?
On the consuming side, we would need to read from both clusters: if cluster A is down, we need to consume the data from cluster B.
I am not sure what the mechanism for writing and reading the data should be.
Maybe the Storm application needs to write data to both clusters A and B, and reading would need to work the same way, from both clusters. But I am not sure how the offsets will behave when consuming the data from both clusters.
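To make the offsets question concrete, here is a minimal simulation (no real brokers, just a plain Python stand-in for one partition per cluster) showing why offsets on two independently written clusters cannot be assumed to match: if one cluster misses a single write, the same offset points at different records on each cluster from then on.

```python
class FakePartition:
    """Minimal stand-in for a single topic partition on one cluster."""
    def __init__(self):
        self.log = []          # append-only message log

    def produce(self, msg):
        self.log.append(msg)   # a record's offset == its index in the log

    def consume(self, offset):
        return self.log[offset]

cluster_a = FakePartition()
cluster_b = FakePartition()

# Dual-write from the application, but cluster B misses one write,
# e.g. because of a transient network failure.
for msg in ["m1", "m2", "m3"]:
    cluster_a.produce(msg)
    if msg != "m2":
        cluster_b.produce(msg)

# The "same" offset now refers to different records on each cluster:
print(cluster_a.consume(1))    # m2
print(cluster_b.consume(1))    # m3
```

This is why consumer offsets committed against cluster A are meaningless on cluster B; a failover consumer has to track its position in application terms (e.g. message keys or timestamps), not raw offsets.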
Maybe adding an advertised listener with a common IP address could make the data readable from either cluster?
I would say that the logic has to be in the application. If this is for fault tolerance or high availability, you can use a single cluster with multiple hosts and a replication factor >= 3 to replicate the topic's data among the hosts. If one broker goes down, you still have two other machines that can take over as partition leader and continue working.
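For example, creating such a topic could look like the sketch below (hostnames, topic name, and partition count are placeholders; HDP-era kafka-topics.sh talks to ZooKeeper, while newer Kafka versions use --bootstrap-server instead):

```shell
# With replication-factor 3, every partition has 3 replicas spread
# across brokers, so the topic survives the loss of any single broker.
bin/kafka-topics.sh --create \
  --zookeeper zk1.example.com:2181 \
  --topic my-topic \
  --partitions 6 \
  --replication-factor 3
```

With this setup, consumers keep a single set of offsets, and leader failover is handled by Kafka itself rather than by your application.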