
spark.streaming.kafka.KafkaUtils error

Expert Contributor

Hi,

I am getting an error while importing the KafkaUtils class:

 

scala> import org.apache.spark.streaming.kafka.KafkaUtils
<console>:13: error: object kafka is not a member of package org.apache.spark.streaming
import org.apache.spark.streaming.kafka.KafkaUtils

 

pom.xml 

 

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>spark</groupId>
    <artifactId>spark-kafka</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>jar</packaging>

    <name>spark-kafka</name>
    <url>http://maven.apache.org</url>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka_2.10</artifactId>
            <version>1.1.0</version>
        </dependency>
    </dependencies>
</project>

 

 

Can anyone help? What is wrong with this?

4 REPLIES

Master Collaborator

You have quite an old version of Spark there, by the way.

You're showing interaction with the shell, but referring to a POM file, which is for a compiled app.

You need to add the JARs to the spark-shell command-line in general to access them.

I think in this old version of Spark the Kafka stuff was actually present in the examples uber jar; maybe just reference that.
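
For example, something along these lines (the jar path is just a placeholder for wherever the artifact lives on your machine; it is not taken from your post):

spark-shell --jars /path/to/spark-streaming-kafka_2.10-1.1.0.jar

scala> import org.apache.spark.streaming.kafka.KafkaUtils

Alternatively, point --jars at the Spark examples uber jar that ships with your distribution and try the same import.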

Expert Contributor

I have created a dependency jar file and used it in spark-shell with the --jars option. It was added fine, but the import still gives this error.

Is this a version issue? Will Spark-Kafka streaming only work on CDH 5.5 and later?

Master Collaborator (accepted solution)

In that case I think it's a version problem. You have a very old version of Spark that may not even have this class. It's nothing to do with CDH per se.

Actually, you shouldn't be packaging Spark with your app at all. And you should find that this class is already part of the main Spark assembly in more recent versions. What if you omit this dependency entirely and try the import?

I just tried importing this in spark-shell on CDH 5.8 and it was available without any additional jars.
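
To double-check on your side, here is a minimal sketch of what should work out of the box in spark-shell on a recent CDH (Spark 1.6-era). The ZooKeeper quorum, consumer group, and topic below are placeholders, not from your setup:

import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

// spark-shell already provides sc, so just build a streaming context on top of it
val ssc = new StreamingContext(sc, Seconds(10))

// placeholder ZooKeeper quorum, consumer group, and topic map (topic -> receiver threads)
val stream = KafkaUtils.createStream(ssc, "zkhost:2181", "test-group", Map("test-topic" -> 1))

// keep only the message value and print a few records of each batch
stream.map(_._2).print()

ssc.start()
ssc.awaitTermination()

If the import alone resolves without any --jars on the command line, the rest is just the usual Spark Streaming setup.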

Expert Contributor

Thanks, I will move to the latest version then.