Explorer
Posts: 62
Registered: ‎01-22-2014

Error in using both Spark context and streaming context in the same program

Hi ,

 

I created a program in Scala to join a stream with a batch file. The program uses both a SparkContext and a StreamingContext. Here is the code:

 

val sparkconf = new SparkConf().setAppName("Streaming and Batch Join")
val sc = new SparkContext(sparkconf)
val ssc = new StreamingContext(SparkContext, Seconds(10))

But the StreamingContext constructor throws the following error:

 

overloaded method constructor StreamingContext with alternatives: (path: String,hadoopConf:
org.apache.hadoop.conf.Configuration)org.apache.spark.streaming.StreamingContext <and> (conf:
org.apache.spark.SparkConf,batchDuration:
org.apache.spark.streaming.Duration)org.apache.spark.streaming.StreamingContext <and> (sparkContext:
org.apache.spark.SparkContext,batchDuration:
org.apache.spark.streaming.Duration)org.apache.spark.streaming.StreamingContext cannot be applied to
(org.apache.spark.SparkContext.type, org.apache.spark.streaming.Duration), 

 

Is there any other way I can use both the streaming and Spark contexts together?

 

Thanks,

Arun

Cloudera Employee
Posts: 481
Registered: ‎08-11-2014

Re: Error in using both Spark context and streaming context in the same program

You mean to pass "sc", the actual SparkContext instance, as the first argument to the StreamingContext constructor. "SparkContext" by itself refers to the companion object/class, not your running context. That is exactly what the compiler error is saying: none of the constructor overloads accepts an argument of type org.apache.spark.SparkContext.type.
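A minimal sketch of the fix, keeping the names from the original post (it assumes Spark's streaming dependency is on the classpath and would run inside a driver program, e.g. a main method):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.{Seconds, StreamingContext}

val sparkconf = new SparkConf().setAppName("Streaming and Batch Join")
val sc = new SparkContext(sparkconf)

// Pass the SparkContext *instance* (sc), not the SparkContext companion object.
// This matches the (sparkContext, batchDuration) overload shown in the error,
// and lets batch RDDs and the stream share the same underlying context.
val ssc = new StreamingContext(sc, Seconds(10))
```

With this change you can create batch RDDs from `sc` and DStreams from `ssc`, then join them, e.g. inside `DStream.transform`.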