Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

Error: Only one SparkContext may be running in this JVM

Expert Contributor

I am trying to run some Spark Streaming examples I found online, but even before I start, I'm getting this error:

Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:82)

I tried the setting below, but it doesn't help:

conf.set("spark.driver.allowMultipleContexts", "true")

Sample code I was trying to run on HDP 2.5:

import org.apache.spark._
import org.apache.spark.streaming._

val conf = new SparkConf().setAppName(appName).setMaster(master)
val ssc = new StreamingContext(conf, Seconds(1))
1 ACCEPTED SOLUTION


Hi @Adnan Alvee,

Are you using spark-shell? If so, a SparkContext is already created for you (as 'sc') and you don't need to create a new one; you should be able to go directly with:

val ssc = new StreamingContext(sc, Seconds(1))
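For completeness, here is a minimal sketch of how the original snippet would look in a standalone application (outside spark-shell), where creating the context yourself is the right approach. The app name and master value are hypothetical placeholders, not from the thread:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingApp {
  def main(args: Array[String]): Unit = {
    // In a standalone app there is no pre-built 'sc', so build a SparkConf;
    // "local[2]" is a placeholder master for local testing (streaming needs
    // at least 2 threads: one receiver, one processor).
    val conf = new SparkConf().setAppName("StreamingApp").setMaster("local[2]")

    // StreamingContext creates its own SparkContext from the conf,
    // so do NOT construct a separate SparkContext first -- that is
    // exactly what triggers the "Only one SparkContext" error.
    val ssc = new StreamingContext(conf, Seconds(1))

    // ... define DStreams here, then:
    // ssc.start()
    // ssc.awaitTermination()
  }
}
```

In spark-shell, by contrast, passing the existing 'sc' (as shown above) avoids constructing a second context entirely.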


2 REPLIES


Expert Contributor

Oh, that worked! Thanks a lot!