
Spark SQL and Hive tables


Hi, I installed Spark 1.1.0 and Hive 0.13, and I tried to run this example code:

# sc is an existing SparkContext.
from pyspark.sql import HiveContext
sqlContext = HiveContext(sc)

sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
sqlContext.sql("LOAD DATA LOCAL INPATH 'examples/src/main/resources/kv1.txt' INTO TABLE src")

# Queries can be expressed in HiveQL.
results = sqlContext.sql("FROM src SELECT key, value").collect()

but I get this error:
Exception in thread "Thread-2" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:190)
at py4j.reflection.TypeUtil.getClass(TypeUtil.java:265)
at py4j.reflection.TypeUtil.forName(TypeUtil.java:245)
at py4j.commands.ReflectionCommand.getUnknownMember(ReflectionCommand.java:153)
at py4j.commands.ReflectionCommand.execute(ReflectionCommand.java:82)
at py4j.GatewayConnection.run(GatewayConnection.java:207)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 8 more

Can anyone help me?

1 ACCEPTED SOLUTION

Explorer

You need to have the hive client jars in your classpath.


12 REPLIES


Please help me!

Explorer

You need to have the hive client jars in your classpath.
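One way to do this (a sketch; the jar names and the `/usr/lib/hive/lib` directory are assumptions based on a typical CDH layout, so adjust them to your installation) is to pass the Hive jars to `pyspark` with `--jars`, or to put the Hive lib directory on the classpath before launching the shell:

```shell
# Sketch: make the Hive client jars visible to the Spark driver and executors.
# Paths and jar names below assume a CDH-style layout; adjust to your install.

# Option 1: pass the jars explicitly when launching PySpark.
pyspark --jars /usr/lib/hive/lib/hive-common.jar,/usr/lib/hive/lib/hive-exec.jar,/usr/lib/hive/lib/hive-metastore.jar

# Option 2: put the whole Hive lib directory on Spark's classpath
# (SPARK_CLASSPATH was the common, if deprecated, mechanism in Spark 1.x).
export SPARK_CLASSPATH="/usr/lib/hive/lib/*"
pyspark
```

Note that whichever mechanism you use, the jars must be visible to the JVM that backs the PySpark gateway, which is why setting only the Python environment does not help here.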


Thanks harsha_v,
but I don't know how to add the Hive client jars to my classpath. Can you explain in more detail?
Thank you


Copy your jars to the /usr/lib/spark/assembly/lib folder, then check again; it should work.

Thanks,

Shekhar Reddy.

Expert Contributor

I tried copying the jar hive-common-0.13.1-cdh5.3.0.jar, which contains org.apache.hadoop.hive.conf.HiveConf, into /usr/lib/spark/lib, but it still gives me the error:

java.lang.NoClassDefFoundError: org.apache.hadoop.hive.conf.HiveConf
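When a copied jar still does not fix a NoClassDefFoundError, it helps to confirm which jars on disk actually contain the missing class. A diagnostic sketch (the /usr/lib/spark/lib directory is an assumption taken from the post above; point it at wherever your jars live):

```shell
# Diagnostic sketch: report every jar in the given directory that contains
# the class Spark says it cannot find. Directory is an assumption; adjust it.
for j in /usr/lib/spark/lib/*.jar; do
  if unzip -l "$j" 2>/dev/null | grep -q 'org/apache/hadoop/hive/conf/HiveConf\.class'; then
    echo "HiveConf found in: $j"
  fi
done
```

Bear in mind that finding the class in a jar on disk is not enough by itself: the jar must also be on the classpath the Spark JVM actually uses at runtime, which is why the classpath-based fixes in this thread matter.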

 

 

Explorer

Hi Tarek,

Did you manage to solve this issue? I am facing the same issue here.

Thanks

Expert Contributor

Yes, I had to rebuild Spark to be compatible with Hive:

http://spark.apache.org/docs/1.2.0/building-spark.html

See this section: Building With Hive and JDBC Support
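For reference, the Spark 1.2 building guide linked above gives a Maven invocation along these lines (the Hadoop profile and version shown are examples from that guide; pick the ones matching your cluster):

```shell
# Build Spark with Hive (and the JDBC thrift server) support enabled.
# The Hadoop profile/version are examples; match them to your cluster.
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver -DskipTests clean package
```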

Explorer

I have Spark 1.2 (CDH 5.3.1).

Do you think I also need to build it myself?

Expert Contributor

That solved my problem, as the version built by Cloudera isn't built with Hive support.