When I create a HiveContext from spark-shell everything works, but the same statements packed into a jar (Scala source) and run with spark-submit throw:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.hive.HiveContext.sql(Ljava/lang/String;)Lorg/apache/spark/sql/Dataset
I am running it with:
spark-submit --class com.mycom.MYApp --master local[4] --driver-class-path /usr/hdp/2.3.6.0-3796/spark/tmplib/spark-hive_2.10-1.5.2.jar /tmp/temp/test.jar
code:

package com.mycom

import org.apache.spark.{SparkConf, SparkContext}

object MYApp {
  def main(args: Array[String]) {
    val sparkConf = new SparkConf().setAppName("sparkSQL")
    val sparkContext = new SparkContext(sparkConf)
    val hiveContext = new org.apache.spark.sql.hive.HiveContext(sparkContext)
    val sqlContext = new org.apache.spark.sql.SQLContext(sparkContext)

    // query a Hive table through the HiveContext
    val hdf = hiveContext.sql("select * from contracts")
    hdf.printSchema()

    // load a MongoDB collection through the MongoDB Spark connector
    val mdf = sqlContext.read
      .format("com.mongodb.spark.sql.DataSource")
      .option("uri", "mongodb://username:password@host:27008/DBname.collectionname")
      .load()
    mdf.printSchema()
  }
}
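In case the build setup matters, a build.sbt matching the versions I am targeting would look roughly like this (a sketch, not my exact file; the project name and the "provided" scoping are assumptions on my part):

```scala
// Sketch of a build.sbt targeting Spark 1.5.2 on Scala 2.10.
// Marking the Spark artifacts "provided" means the cluster's own jars
// are used at runtime, so the compile-time version has to match them.
name := "test"

scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.5.2" % "provided",
  "org.apache.spark" %% "spark-sql"  % "1.5.2" % "provided",
  "org.apache.spark" %% "spark-hive" % "1.5.2" % "provided"
)
```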
I have already tried running spark-submit with --driver-class-path, with --jars, and with --files /path/to/hive-site.xml, but no luck 😞
I am running Spark 1.5.2 with Scala 2.10.
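To rule out a mismatch between the version the shell uses and the one my jar was built against, I checked the node like this (the tmplib path comes from my HDP install, so adjust it for another layout):

```shell
# Print the Spark version that spark-submit actually runs against
spark-submit --version

# List the spark-hive jars present on the node to see their versions
ls /usr/hdp/2.3.6.0-3796/spark/tmplib/ | grep spark-hive
```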
Help please.