Support Questions

Hello! I'm trying to run a Spark classifier using Scala on my machine, but I'm getting the following error. I need help fixing it.

New Contributor

```
Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true.
```

1 REPLY

Expert Contributor

You are trying to create a second SparkContext in the same JVM. Instead of constructing a new one, reuse the one that already exists.

In `spark-shell`, `sc` is the SparkContext which Spark created for you.
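If you are running a standalone application rather than `spark-shell`, the usual way to avoid this error is `SparkSession.builder().getOrCreate()`, which returns the already-running session (and its SparkContext) instead of creating a second one. A minimal sketch, assuming Spark 2.x or later; the object name `ClassifierApp` and the app name are illustrative:

```scala
import org.apache.spark.sql.SparkSession

object ClassifierApp {
  def main(args: Array[String]): Unit = {
    // getOrCreate() reuses an existing SparkSession/SparkContext if one
    // is already running in this JVM, avoiding the SPARK-2243 error.
    val spark = SparkSession.builder()
      .appName("classifier") // illustrative name
      .getOrCreate()

    // Equivalent of the `sc` variable that spark-shell provides.
    val sc = spark.sparkContext

    // ... build and train your classifier using `spark` / `sc` here ...

    spark.stop()
  }
}
```

Note that `spark.driver.allowMultipleContexts = true` only suppresses the check and is not recommended; reusing the existing context is the supported approach.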
