Hadoop + Spark Use Case

Rising Star

Is there a use case that shows how Hadoop and Spark work together? I have already read the theory, but I want to see something practical to get a better understanding. Thanks!

1 ACCEPTED SOLUTION

Master Guru

Spark and Hadoop go together like peanut butter and jelly.

Check out my slides

https://community.hortonworks.com/content/idea/28342/apache-zeppelin-with-scala-spark-introduction-t...

https://community.hortonworks.com/content/kbentry/34784/data-ingest-with-apache-zeppelin-apache-spar...

I have worked at a few places that used Spark and Spark Streaming to ingest data into HDFS and HBase, then Spark with MLlib and H2O to run machine learning on the data, then Hive and Spark SQL for queries, and reporting through the Hive Thrift server to Tableau.
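
As a minimal sketch of the ingest step, assuming a hypothetical socket source on collector-host:9999 and a hypothetical HDFS prefix (Spark 1.6-era Scala API):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingIngest {
  def main(args: Array[String]): Unit = {
    // 10-second micro-batches
    val ssc = new StreamingContext(new SparkConf().setAppName("Ingest"), Seconds(10))

    // Hypothetical source: a TCP feed of newline-delimited events
    val events = ssc.socketTextStream("collector-host", 9999)

    // Land each batch in HDFS as a new directory under this (hypothetical) prefix
    events.saveAsTextFiles("hdfs:///data/raw/events")

    ssc.start()
    ssc.awaitTermination()
  }
}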

Spark without Hadoop is really missing out on a lot.

And with Spark 1.6 on HDP you get all the benefits of running as a YARN application: common security and locality of data access.

I wouldn't run Spark without Hadoop unless you are running Spark standalone for development.

Even then, Zeppelin + Spark 1.6 on HDP is an awesome development environment.

4 REPLIES

Expert Contributor

There is a good blog post over at MapR regarding this. I personally think the Network Security use case is especially compelling.

https://www.mapr.com/blog/game-changing-real-time-use-cases-apache-spark-on-hadoop

Guru

If you think of Hadoop as HDFS and YARN, Spark can take advantage of both. It uses HDFS (storage that can be horizontally expanded by adding more nodes) by reading input data from HDFS and writing the final processed data back to it, and it uses YARN (compute that can be horizontally expanded by adding more nodes) by running as a YARN application.
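
To make that concrete, here is a minimal sketch, assuming hypothetical HDFS paths; the same jar runs on YARN when submitted with spark-submit --master yarn-client (Spark 1.6 syntax):

import org.apache.spark.{SparkConf, SparkContext}

// Submit with: spark-submit --master yarn-client --class HdfsOnYarn app.jar
object HdfsOnYarn {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("HdfsOnYarn"))

    // Read input that already lives in HDFS (path is hypothetical)
    val lines = sc.textFile("hdfs:///data/input/logs")

    // The computation runs in YARN-managed executors, scheduled near the HDFS blocks
    val errorCounts = lines.filter(_.contains("ERROR"))
      .map(line => (line.split(" ")(0), 1))
      .reduceByKey(_ + _)

    // Write the final processed data back to HDFS
    errorCounts.saveAsTextFile("hdfs:///data/output/error-counts")
    sc.stop()
  }
}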

If you are looking for use cases, look at the MLlib algorithms, which cover a lot of use cases and run on top of Spark.
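
For example, a minimal MLlib sketch, assuming feature vectors stored as comma-separated lines at a hypothetical HDFS path:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.clustering.KMeans
import org.apache.spark.mllib.linalg.Vectors

object KMeansSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("KMeansSketch"))

    // Parse comma-separated feature vectors from HDFS (path is hypothetical)
    val data = sc.textFile("hdfs:///data/features")
      .map(line => Vectors.dense(line.split(",").map(_.toDouble)))
      .cache()

    // Cluster into 3 groups, with at most 20 iterations
    val model = KMeans.train(data, k = 3, maxIterations = 20)
    println(s"Within-set sum of squared errors: ${model.computeCost(data)}")
    sc.stop()
  }
}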

Rising Star

Yes, when I think about Hadoop I mean storing the data in HDFS. I just don't know what kind of advantage I can get from Spark. Data cleansing?
