I created a program in Scala that joins a stream with a batch file. In the program I am using both a SparkContext and a StreamingContext. Following is the code:

```scala
val sparkconf = new SparkConf().setAppName("Streaming and Batch Join")
val sc = new SparkContext(sparkconf)
val ssc = new StreamingContext(SparkContext, Seconds(10))
```
But the `StreamingContext` line is throwing the following error:

```
overloaded method constructor StreamingContext with alternatives:
  (path: String,hadoopConf: org.apache.hadoop.conf.Configuration)org.apache.spark.streaming.StreamingContext <and>
  (conf: org.apache.spark.streaming.Duration)org.apache.spark.streaming.StreamingContext <and>
  (sparkContext: org.apache.spark.streaming.Duration)org.apache.spark.streaming.StreamingContext
cannot be applied to
```
Is there any other way I can use both the streaming and the Spark context?
You meant to pass `sc`, the actual `SparkContext` *instance*, as the first argument to the `StreamingContext` constructor. `SparkContext` by itself refers to the class (companion object), not an instance of it, which is why none of the constructor overloads match. That is what the compiler error is telling you.
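A minimal sketch of the corrected setup. The file path and the keyed-join comments are placeholders, not from the original question; they only illustrate how the same `sc` can serve both the batch load and the streaming job:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingBatchJoin {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf().setAppName("Streaming and Batch Join")
    val sc  = new SparkContext(sparkConf)           // batch context
    val ssc = new StreamingContext(sc, Seconds(10)) // pass the instance, not the class

    // sc is still usable for batch work, e.g. loading the batch file
    // (hypothetical path):
    // val batchRdd = sc.textFile("hdfs://.../batch-file").map(parseToKeyValue)

    // Each micro-batch of a keyed DStream can then be joined against it:
    // val joined = stream.transform(rdd => rdd.join(batchRdd))

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Since `StreamingContext` wraps the `SparkContext` you give it, you never need two separate contexts: one `sc` is shared by both the batch and streaming sides of the join.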