Support Questions
Find answers, ask questions, and share your expertise

Flume HA

Solved


Champion Alumni

Hello,

Is there a way to deploy Flume in HA?

Thank you,

GHERMAN Alina
1 ACCEPTED SOLUTION

Re: Flume HA

Expert Contributor

So am I understanding correctly?

Data generating server -> HTTP Post -> Flume HTTP Source -> Flume sink -> etc.

and you want to have two Flume HTTP Source machines that can be written to and can receive that data, in case one of them goes down? You also don't want to have to manage something like a load balancer/proxy between the data generating server and the Flume HTTP Source box?

If you can handle de-duplication on your back end, then I think you could do this by sending the same data to two different Flume HTTP Source servers at the same time, possibly tagging the data in your sink to help you de-duplicate later.

Data generating server -> HTTP Post -> Flume HTTP Source -> Flume sink with tag --> to de-duplication
               |
               --------------------> HTTP Post -> Flume HTTP Source -> Flume sink with tag --> to de-duplication
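The dual-post idea above can be sketched on the client side. This is a minimal sketch, not a tested implementation: the endpoint hosts (`flume1`, `flume2`) and port are placeholders, and it assumes the HTTP Sources use Flume's default JSONHandler, which accepts a JSON array of events, each with `headers` and `body`. The `event_id` header is a hypothetical tag used for de-duplication downstream.

```python
import json
import urllib.request

# Placeholder endpoints; replace with your two Flume HTTP Source hosts.
FLUME_ENDPOINTS = ["http://flume1:5140", "http://flume2:5140"]

def build_payload(body, event_id):
    """Build the JSON array Flume's default HTTP JSONHandler expects,
    tagging the event with an event_id header for later de-duplication."""
    return json.dumps(
        [{"headers": {"event_id": event_id}, "body": body}]
    ).encode("utf-8")

def post_event(body, event_id):
    """Send the same event to every Flume HTTP Source, one copy each.
    Returns how many endpoints accepted the event."""
    payload = build_payload(body, event_id)
    delivered = 0
    for url in FLUME_ENDPOINTS:
        try:
            req = urllib.request.Request(
                url,
                data=payload,
                headers={"Content-Type": "application/json"},
            )
            urllib.request.urlopen(req, timeout=5)
            delivered += 1
        except OSError:
            # One source being down is fine; the other copy still arrives.
            pass
    return delivered
```

As long as at least one endpoint accepts the event, the data survives a single-node failure; the duplicate copies are then reconciled by the `event_id` tag on the back end.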

4 REPLIES

Re: Flume HA

Super Collaborator
Can you please define a little more what your use case or requirements are? Flume can replicate ingestion paths to ensure multiple copies downstream, and can also load-balance between downstream Flume agents to ensure delivery of events down multiple data flow paths.

-PD
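The replication and load-balancing mentioned above are configured on the agent itself. A rough sketch of both, assuming placeholder agent and component names (`agent`, `httpSrc`, `sink1`, `sink2`):

```properties
# Replicate each event from the source onto two channels (two downstream paths).
agent.sources.httpSrc.selector.type = replicating

# Load-balance delivery across two sinks pointing at downstream Flume agents.
agent.sinkgroups = g1
agent.sinkgroups.g1.sinks = sink1 sink2
agent.sinkgroups.g1.processor.type = load_balance
agent.sinkgroups.g1.processor.selector = round_robin
agent.sinkgroups.g1.processor.backoff = true
```

Note this protects the paths *after* the source; it does not by itself make the HTTP Source endpoint highly available, which is the concern raised below.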

Re: Flume HA

Champion Alumni

Hello,

Thank you for your reply.

In my case the Flume source is HTTP, and I wanted to know if there is a way to ensure that if the machine with the Flume source goes down, I can still receive the data (HA).

However, the only solution I can imagine is two sources with a load balancer machine in front of them... and I was searching more for a solution within the Hadoop cluster (as it is done with YARN and HBase).

Thank you,

Alina

GHERMAN Alina


Re: Flume HA

Champion Alumni

I'm not sure that I can change all the sources in order to post to all my Flume agents, but this is an interesting solution.

Thank you!

GHERMAN Alina