Created on 11-26-2014 09:34 PM - edited 09-16-2022 02:14 AM
Hi, I installed Spark 1.1.0 and Hive 0.13, and I am trying to run this example code:
# sc is an existing SparkContext.
from pyspark.sql import HiveContext

sqlContext = HiveContext(sc)

sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
sqlContext.sql("LOAD DATA LOCAL INPATH 'examples/src/main/resources/kv1.txt' INTO TABLE src")

# Queries can be expressed in HiveQL.
results = sqlContext.sql("FROM src SELECT key, value").collect()
but I get this error:
Exception in thread "Thread-2" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:190)
at py4j.reflection.TypeUtil.getClass(TypeUtil.java:265)
at py4j.reflection.TypeUtil.forName(TypeUtil.java:245)
at py4j.commands.ReflectionCommand.getUnknownMember(ReflectionCommand.java:153)
at py4j.commands.ReflectionCommand.execute(ReflectionCommand.java:82)
at py4j.GatewayConnection.run(GatewayConnection.java:207)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 8 more
Can anyone help me?
Created 11-27-2014 07:20 PM
Please help me!
Created 11-28-2014 07:02 AM
You need to have the hive client jars in your classpath.
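One way to do that is to put the Hive jars on the driver classpath when launching pyspark. This is a sketch, not a definitive fix: the jar locations below are assumptions and depend on your install (on CDH they are often under /usr/lib/hive/lib or /opt/cloudera/parcels/CDH/lib/hive/lib), and as later replies in this thread show, Spark itself must also be built with Hive support for HiveContext to work.

```shell
# Assumption: Hive client jars live under /usr/lib/hive/lib -- adjust for your install.
# Option 1: add them for this session via the pyspark/spark-submit flag.
# The quoted "*" is a JVM classpath wildcard, not a shell glob.
pyspark --driver-class-path "/usr/lib/hive/lib/*"

# Option 2 (older Spark 1.x style): export SPARK_CLASSPATH before launching.
export SPARK_CLASSPATH="/usr/lib/hive/lib/*"
pyspark
```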
Created 11-28-2014 08:18 AM
Thanks harsha_v,
but I don't know how to add the Hive client jars to my classpath. Can you explain in more detail?
Thank you.
Created 01-22-2015 05:38 AM
Copy your jars to the /usr/lib/spark/assembly/lib folder, then check again. It should work.
thanks,
Shekhar Reddy.
Created 02-17-2015 06:12 AM
I tried copying the jar "hive-common-0.13.1-cdh5.3.0.jar" - which contains "org.apache.hadoop.hive.conf.HiveConf" - into
"/usr/lib/spark/lib", but it still gives me the error:
java.lang.NoClassDefFoundError: org.apache.hadoop.hive.conf.HiveConf
Created 03-11-2015 06:30 AM
Hi Tarek,
Did you manage to solve this issue?
I am facing the same issue here
Thanks
Created 03-11-2015 06:52 AM
Yes, I had to rebuild Spark to be compatible with Hive:
http://spark.apache.org/docs/1.2.0/building-spark.html
See this section:
Building With Hive and JDBC Support
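For reference, that section of the Spark 1.2 build docs uses a Maven command along these lines; the Hadoop version shown is only an example and should match your cluster.

```shell
# Build Spark with Hive and JDBC/Thrift server support.
# Adjust -Dhadoop.version (and profiles) to match your cluster.
mvn -Pyarn -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver -DskipTests clean package
```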
Created 03-11-2015 06:57 AM
I have Spark 1.2 (CDH 5.3.1).
Do you think I also need to build it myself?
Created 03-11-2015 07:19 AM
That solved my problem, as the version shipped with Cloudera isn't built with Hive support.