08-09-2016 02:36 PM
I am getting an error while importing the KafkaUtils class:
scala> import org.apache.spark.streaming.kafka.KafkaUtils
<console>:13: error: object kafka is not a member of package org.apache.spark.streaming
From my pom.xml (truncated):
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
Can anyone help? What is wrong here?
08-09-2016 03:43 PM
You have a quite old version of Spark there by the way.
You're showing interaction with the shell, but referring to a POM file, which is for a compiled app.
In general, you need to add the JARs to the spark-shell command line in order to access them.
I think in this old version of Spark the Kafka stuff was actually present in the examples uber jar; maybe just reference that.
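As an illustration, the invocation might look something like the following. The jar path is an assumption; locate the actual examples jar in your own Spark installation (on CDH it typically lives under the Spark lib directory):

```
# Hypothetical sketch: pass the examples uber jar (which bundled the Kafka
# integration classes in older Spark releases) to spark-shell explicitly.
# Adjust the path to wherever the jar lives on your cluster.
spark-shell --jars /usr/lib/spark/lib/spark-examples.jar
```

With the jar on the classpath, the `import org.apache.spark.streaming.kafka.KafkaUtils` shown above should then resolve, assuming that Spark version actually contains the class.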
08-10-2016 08:38 AM
I have created a dependency JAR file and used it in spark-shell with the --jars option. It was added fine, but the import still gives this error.
Is this a version issue? Does Spark Kafka streaming only work on CDH 5.5 and later?
08-10-2016 08:43 AM
In that case I think it's a version problem. You have a very old version of Spark that may not even have this class. It's nothing to do with CDH per se.
Actually: you shouldn't be packaging Spark with your app at all. And, you should find that this class is already part of the main assembly in more recent Spark versions. What if you omit this entirely and try the import?
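As a sketch of what that means for the POM: mark the Spark dependencies as `provided` so they come from the cluster rather than being packaged into your application jar. The artifact IDs and version below are assumptions; match them to the Spark/Scala versions your cluster actually runs:

```xml
<!-- Hypothetical sketch: Spark itself is supplied by the cluster, so it is
     marked "provided" and excluded from the packaged application jar. Only
     the Kafka integration module would need to be shipped with the app if
     it is not already on the cluster's classpath. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.10</artifactId>
  <version>1.5.0-cdh5.5.0</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kafka_2.10</artifactId>
  <version>1.5.0-cdh5.5.0</version>
</dependency>
```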
I just tried importing this in spark-shell in CDH 5.8 and it was available, without any additional jars.