Created on 01-05-2015 10:10 AM - edited 09-16-2022 02:17 AM
We have just started with Cloudera. I have a question about Spark: does CDH 5.3.x support HiveContext?
I have installed CDH 5.3. When I try to create the Hive context in the Scala editor, I get the following error:
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
error: bad symbolic reference. A signature in HiveContext.class refers to term hive in package org.apache.hadoop which is not available. It may be completely missing from the current classpath, or the version on the classpath might be incompatible with the version used when compiling HiveContext.class. error:
Thanks,
Satya K
Created 01-07-2015 04:09 PM
Created 01-07-2015 04:16 PM
Ok. Thank you!
Created 01-08-2015 02:17 AM
Not supported, but all of the standard bits are there. It should work just like any other installation; you will probably have to put the Hive jars on your classpath manually, says Marcelo.
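For example (a minimal sketch, not from the original thread, assuming a packages-based CDH 5.3 install where the Hive jars live under /usr/lib/hive/lib; on a parcels install they are under /opt/cloudera/parcels/CDH/lib/hive/lib), you could launch spark-shell with those jars on both the executor and driver classpaths:

# Sketch: collect the CDH Hive jars and pass them to spark-shell.
# Adjust the path for a parcels install.
HIVE_JARS=$(echo /usr/lib/hive/lib/*.jar | tr ' ' ',')
spark-shell \
  --jars "$HIVE_JARS" \
  --driver-class-path '/usr/lib/hive/lib/*'

With the jars visible, the line from the original question, val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc), should then resolve in the shell.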
Created 02-17-2015 01:30 AM
Can you please tell me how to add the Hive jars to the classpath?
Created 02-17-2015 02:34 AM
Created 02-17-2015 04:34 AM
Okay, I will try this, thanks so much. But I have another question, if you please: I add the external jars I need and all of them work normally, but "org.apache.hadoop.hive.conf.HiveConf", which exists in hive-common-0.13.1-cdh5.3.0.jar, gives a "class not found" error. Why does that happen?
The command I run:
sudo spark-submit --class "WordCount" --master local[*] --jars /usr/local/WordCount/target/scala-2.10/spark-streaming-flume_2.11-1.2.0.jar,/usr/lib/avro/avro-ipc-1.7.6-cdh5.3.0.jar,/usr/lib/flume-ng/lib/flume-ng-sdk-1.5.0-cdh5.3.0.jar,/usr/lib/hive/lib/hive-common-0.13.1-cdh5.3.0.jar,/usr/local/WordCount/target/scala-2.10/spark-hive_2.10-1.2.0-cdh5.3.0.jar /usr/local/WordCount/target/scala-2.10/wordcount_2.10-1.0.jar 127.0.0.1 9999
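One thing that may be worth checking (a guess rather than a confirmed diagnosis): with Spark 1.2-era spark-submit in local mode, jars passed via --jars are not always visible on the driver's own classpath, so a class the driver needs, like org.apache.hadoop.hive.conf.HiveConf, can still fail to load. A common workaround is to also put the Hive jars on the driver classpath with --driver-class-path; the /usr/lib/hive/lib/* path below is an assumption based on a packages install:

# Sketch: same submit as above, with the Hive jar directory also on the
# driver classpath. The --jars list is unchanged from the original command.
sudo spark-submit --class "WordCount" --master "local[*]" \
  --driver-class-path '/usr/lib/hive/lib/*' \
  --jars /usr/local/WordCount/target/scala-2.10/spark-streaming-flume_2.11-1.2.0.jar,\
/usr/lib/avro/avro-ipc-1.7.6-cdh5.3.0.jar,\
/usr/lib/flume-ng/lib/flume-ng-sdk-1.5.0-cdh5.3.0.jar,\
/usr/lib/hive/lib/hive-common-0.13.1-cdh5.3.0.jar,\
/usr/local/WordCount/target/scala-2.10/spark-hive_2.10-1.2.0-cdh5.3.0.jar \
  /usr/local/WordCount/target/scala-2.10/wordcount_2.10-1.0.jar 127.0.0.1 9999

If that resolves the HiveConf error, the equivalent persistent setting is spark.driver.extraClassPath in spark-defaults.conf.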