Created 09-09-2017 11:27 AM
Creating a HiveContext from spark-shell works fine, but the same statements, packaged in a jar (Scala source) and run with spark-submit, throw: Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.hive.HiveContext.sql(Ljava/lang/String;)Lorg/apache/spark/sql/Dataset
I am running it with:
spark-submit --class com.mycom.MYApp --master local[4] --driver-class-path /usr/hdp/2.3.6.0-3796/spark/tmplib/spark-hive_2.10-1.5.2.jar /tmp/temp/test.jar
code:
import org.apache.spark.{SparkConf, SparkContext}

def main(args: Array[String]) {
  val sparkConf = new SparkConf().setAppName("sparkSQL")
  val sparkContext = new SparkContext(sparkConf)
  val hiveContext = new org.apache.spark.sql.hive.HiveContext(sparkContext)
  val sqlContext = new org.apache.spark.sql.SQLContext(sparkContext)

  // Read from the Hive metastore
  val hdf = hiveContext.sql("select * from contracts")
  hdf.printSchema()

  // Read from MongoDB via the Spark connector
  val mdf = sqlContext.read.format("com.mongodb.spark.sql.DataSource")
    .option("uri", "mongodb://username:password@host:27008/DBname.collectionname")
    .load()
  mdf.printSchema()
}
I have already tried running spark-submit with --driver-class-path, and also with --jars and --files /path/to/hive-site.xml, but no luck 😞
I am running Spark 1.5.2 and Scala 2.10.
Help please.
Created 09-11-2017 09:09 AM
Hi @Mukesh Burman,
I saw in your code that you refer to MongoDB. Did you add all the necessary libraries for MongoDB?
Michel
Created 09-12-2017 05:36 AM
Hi @msumbul,
Thanks for your reply,
This issue was mainly due to multiple versions of Spark being installed without being properly configured.
Because of that, spark-shell was picking up Spark 1.5 while spark-submit was picking up Spark 1.3.
I installed Spark 2.2 and made 2.2 the major version, and the issue is resolved.
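For anyone landing here after a similar upgrade: in Spark 2.x, HiveContext and SQLContext are superseded by SparkSession, so the original code needs a small rewrite to run cleanly on 2.2. A minimal sketch (the table name, data-source name, and MongoDB URI are carried over from the original post, not verified here):

```scala
import org.apache.spark.sql.SparkSession

object MYApp {
  def main(args: Array[String]): Unit = {
    // SparkSession replaces both HiveContext and SQLContext in Spark 2.x;
    // enableHiveSupport() wires in the Hive metastore.
    val spark = SparkSession.builder()
      .appName("sparkSQL")
      .enableHiveSupport()
      .getOrCreate()

    // spark.sql returns a Dataset[Row] (DataFrame) in Spark 2.x
    val hdf = spark.sql("select * from contracts")
    hdf.printSchema()

    // Same read as before, now off the unified session
    val mdf = spark.read
      .format("com.mongodb.spark.sql.DataSource")
      .option("uri", "mongodb://username:password@host:27008/DBname.collectionname")
      .load()
    mdf.printSchema()

    spark.stop()
  }
}
```

Compiling against the same Spark version you submit with (and a matching MongoDB Spark connector build) avoids the NoSuchMethodError class of failures entirely.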
Thanks again.