
Error in using both Spark context and streaming context in the same program


Explorer

Hi,

I wrote a Scala program that joins a stream with a batch file. In the program I use both a SparkContext and a StreamingContext. Here is the code:

 

val sparkconf = new SparkConf().setAppName("Streaming and Batch Join")
val sc = new SparkContext(sparkconf)
val ssc = new StreamingContext(SparkContext, Seconds(10))

But the StreamingContext constructor fails with the following compile error:

 

overloaded method constructor StreamingContext with alternatives: (path: String,hadoopConf:
org.apache.hadoop.conf.Configuration)org.apache.spark.streaming.StreamingContext <and> (conf:
org.apache.spark.SparkConf,batchDuration:
org.apache.spark.streaming.Duration)org.apache.spark.streaming.StreamingContext <and> (sparkContext:
org.apache.spark.SparkContext,batchDuration:
org.apache.spark.streaming.Duration)org.apache.spark.streaming.StreamingContext cannot be applied to
(org.apache.spark.SparkContext.type, org.apache.spark.streaming.Duration), 

 

Is there any other way I can use both the streaming and Spark contexts?

 

Thanks,

Arun

1 REPLY

Re: Error in using both Spark context and streaming context in the same program

Master Collaborator

You mean to pass "sc", the actual SparkContext instance, as the first argument to the StreamingContext constructor. "SparkContext" on its own refers to the companion object (of type SparkContext.type), not your instance, and no constructor overload accepts that. That's what the compiler error is saying.
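The instance-vs-companion-object distinction the compiler is complaining about can be shown without Spark at all. This is a minimal, self-contained sketch; the class "Ctx" and function "build" are hypothetical names chosen only for illustration:

```scala
// A class plus its companion object, mirroring SparkContext's setup.
class Ctx(val name: String)
object Ctx // the companion object; its type is Ctx.type, not Ctx

// A function that requires an *instance* of Ctx, like the
// StreamingContext(sparkContext: SparkContext, ...) constructor does.
def build(ctx: Ctx): String = s"built from ${ctx.name}"

val c = new Ctx("app")
val result = build(c) // compiles: c is an instance of Ctx
// build(Ctx)         // does not compile: Ctx here is the companion
//                    // object (Ctx.type), which is not a Ctx
```

Applied to the original snippet, the fix the reply describes is to pass the instance: new StreamingContext(sc, Seconds(10)).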