Support Questions


Hi, I'm supposed to set up a real-time analytics architecture for my Master's thesis.


So I installed Hortonworks and would like to use PostgreSQL as the data source, push the data into Kafka, consume it from Kafka with Spark Streaming, and finally store the results from Spark Streaming in HBase. I don't know how to begin, nor how to use the Ambari PostgreSQL. Does anyone have any advice to give me? Thank you in advance.
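For reference, this is the rough shape I have in mind for the Kafka-to-HBase leg. It is only a minimal sketch, assuming Spark's Kafka direct stream (spark-streaming-kafka-0-10) and the HBase client API; the broker address, topic, table, and column family names are placeholders I made up:

```scala
// Minimal sketch: consume from Kafka with Spark Streaming and write each record to HBase.
// Broker address, topic, table, and column family are placeholders.
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{ConnectionFactory, Put}
import org.apache.hadoop.hbase.util.Bytes

object KafkaToHBase {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf().setAppName("KafkaToHBase")
    val ssc = new StreamingContext(sparkConf, Seconds(5)) // 5-second micro-batches

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "broker1:6667",              // placeholder broker
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "thesis-streaming",
      "auto.offset.reset" -> "latest"
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](Seq("events"), kafkaParams))

    stream.foreachRDD { rdd =>
      rdd.foreachPartition { records =>
        // Open one HBase connection per partition, not one per record
        val conn = ConnectionFactory.createConnection(HBaseConfiguration.create())
        val table = conn.getTable(TableName.valueOf("events"))
        records.foreach { r =>
          // Fall back to a random row key if the Kafka record has no key
          val rowKey = Option(r.key).getOrElse(java.util.UUID.randomUUID().toString)
          val put = new Put(Bytes.toBytes(rowKey))
          put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("value"), Bytes.toBytes(r.value))
          table.put(put)
        }
        table.close()
        conn.close()
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```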

1 ACCEPTED SOLUTION

Guru

Hi @Mamy D

I would look at the tutorials available at Hortonworks.com that cover real-time analytics and try them on your installed cluster. You may also want to include Hortonworks Data Flow. A good one to get started with Kafka may be: https://hortonworks.com/tutorial/realtime-event-processing-in-hadoop-with-nifi-kafka-and-storm/
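On the PostgreSQL side, note that the PostgreSQL instance Ambari sets up is there for Ambari's own metadata; for your data source you would normally point at your own PostgreSQL database. For pushing rows from that database into Kafka, NiFi (part of Hortonworks Data Flow) is the more robust route, but if you just want something minimal to get data flowing while you prototype, a plain JDBC read plus the Kafka producer API is enough. A rough sketch, where the JDBC URL, credentials, query, topic, and broker address are placeholders, and a real ingest job would poll incrementally rather than read once:

```scala
// Minimal sketch: read rows from PostgreSQL over JDBC and publish them to Kafka.
// Connection details, query, and topic are placeholders; a real ingest job would
// poll incrementally (e.g. by a timestamp or id column) instead of reading once.
import java.sql.DriverManager
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

object PostgresToKafka {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "broker1:6667") // placeholder broker address
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    val producer = new KafkaProducer[String, String](props)

    val conn = DriverManager.getConnection(
      "jdbc:postgresql://dbhost:5432/mydb", "user", "password") // placeholder credentials
    val stmt = conn.createStatement()
    val rs = stmt.executeQuery("SELECT id, payload FROM source_events") // placeholder query

    // Publish each row as a keyed string message on the "events" topic
    while (rs.next()) {
      producer.send(new ProducerRecord[String, String](
        "events", rs.getString("id"), rs.getString("payload")))
    }

    producer.flush()
    producer.close()
    rs.close()
    stmt.close()
    conn.close()
  }
}
```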

Best of luck on your journey.

