Support Questions
Find answers, ask questions, and share your expertise

Is it mandatory to start Hadoop to run spark application



Re: Is it mandatory to start Hadoop to run spark application

@sims soni,

Can you please elaborate on what you mean by "start Hadoop"?

For basic Spark use cases on a cluster, you need HDFS, YARN, and of course Spark itself to be up and running. (Spark does not depend on the MapReduce framework to execute its jobs.)
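As a sketch, once HDFS and YARN are up, a Spark application is typically submitted to the cluster like this (the application jar, class name, and resource settings below are placeholders, not from the original thread):

```shell
# Submit a Spark application to a running YARN cluster.
# --master yarn requires the YARN ResourceManager and HDFS to be available.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyApp \        # placeholder main class
  --num-executors 2 \
  --executor-memory 2g \
  my-app.jar                          # placeholder application jar
```

If you only want to experiment locally without a Hadoop cluster, `--master local[*]` runs Spark in local mode instead.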

For spark-sql, you additionally need Hive to be running, since Spark SQL uses the Hive metastore for table metadata.
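For example, with the Hive metastore service up you can run a quick query from the Spark SQL CLI (the query itself is just an illustration):

```shell
# Launch the Spark SQL shell and run a one-off statement;
# this connects to the Hive metastore for database/table metadata.
spark-sql -e "SHOW DATABASES;"
```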