Support Questions


Error when using HiveContext: java.lang.NoSuchFieldError: HIVE_SUPPORT_SQL11_RESERVED_KEYWORDS

Explorer

Hello. I am new to Spark, and this error has cost me a lot of time. Please help me get past it.

 

15/09/27 10:24:26 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
Exception in thread "main" java.lang.NoSuchFieldError: HIVE_SUPPORT_SQL11_RESERVED_KEYWORDS
at org.apache.spark.sql.hive.HiveContext.defaultOverides(HiveContext.scala:175)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:178)
at LoadHive2.main(LoadHive2.java:69)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 10:24:26 INFO spark.SparkContext: Invoking stop() from shutdown hook
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static/sql,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution/json,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/json,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
15/09/27 10:24:26 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}


My Java code:

 

import org.apache.spark.SparkContext;
import org.apache.spark.SparkConf;
import org.apache.spark.sql.SQLContext;
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.spark.sql.hive.HiveContext;

public class LoadHive2 {

    public static void main(String[] args) {

        SparkConf sparkConf = new SparkConf().setAppName("WordCount");
        SparkContext sc = new SparkContext(sparkConf);

        HiveContext sqlContext = new HiveContext(sc);
        sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)");

    } // main
} // class

 

My pom.xml:

 

<project>
  <modelVersion>4.0.0</modelVersion>

  <groupId>LIMOS</groupId>
  <artifactId>load-hive</artifactId>
  <version>1.0</version>
  <packaging>jar</packaging>
  <name>LoadHive Project</name>

  <dependencies>

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.4.0</version>
    </dependency>

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.10</artifactId>
      <version>1.4.0</version>
    </dependency>

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-hive_2.11</artifactId>
      <version>1.5.0</version>
    </dependency>

  </dependencies>
</project>

 

My system install: Hadoop 2.7.1, Hive 0.14.0, spark-1.5.0-bin-hadoop2.6. I am using MySQL for the Hive metastore.

 

The code above built successfully into a jar with Maven, but when I submitted the jar with the following command, the error above occurred:

home/user/spark-1.5.0-bin-hadoop2.6/bin/spark-submit --class "LoadHive2" --master spark://10.0.2.10:7077 target/load-hive-1.0.jar

It is worth noting that when I tested some examples that did not use HiveContext, the jobs ran fine on the Spark cluster. I can also access Hive directly to create tables and run SQL.

 

I am looking for the solution in the following direction: the Hive attribute HIVE_SUPPORT_SQL11_RESERVED_KEYWORDS is not recognized by my current combination of Spark 1.5.0 and Hive 0.14.0. Although this attribute was recognized by the compiler that built the jar file, it was not recognized by the execution engine. (This means the compile-time and runtime classpaths are probably not the same.)

But so far I could not fix it, and it is giving me a headache! Please help me. Thank you in advance.
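One way to check whether the compile-time and runtime classpaths really differ is to print which jar a class was actually loaded from. Below is a minimal sketch of that technique; `WhichJar` and `locationOf` are illustrative names, and for this particular error you would pass `org.apache.hadoop.hive.conf.HiveConf.class` and run the probe via spark-submit on the cluster:

```java
import java.net.URL;
import java.security.CodeSource;

public class WhichJar {

    // Returns the jar or directory a class was loaded from,
    // or "unknown" for bootstrap classes that have no code source.
    static String locationOf(Class<?> cls) {
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        if (src == null) return "unknown";
        URL loc = src.getLocation();
        return loc == null ? "unknown" : loc.toString();
    }

    public static void main(String[] args) {
        // On the real cluster, replace WhichJar.class with
        // org.apache.hadoop.hive.conf.HiveConf.class; the printed path
        // shows which Hive jar the driver actually loaded.
        System.out.println(locationOf(WhichJar.class));
    }
}
```

If the Hive jar printed at run time differs from the one the build compiled against, a NoSuchFieldError like the one above is the expected symptom.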

 

1 ACCEPTED SOLUTION

Master Collaborator
Here, you're using your own build of Spark against an older version of
Hive than what's in CDH. That might mostly work, but you're seeing the
problems that come from compiling and running against different versions.
I'm afraid you're on your own if you're rolling your own build, but I
expect you'll get much closer if you make a build targeting the same
Hive version as in CDH.
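As a concrete sketch of what consistent dependencies might look like with the prebuilt spark-1.5.0-bin-hadoop2.6 named above (the `_2.10` suffix and `provided` scope are assumptions: the stock 1.5.0 binaries are built for Scala 2.10, and `provided` keeps the cluster's own Spark jars authoritative at run time):

```xml
<!-- Sketch: all three Spark artifacts on the same version and the same
     Scala binary suffix, marked provided so the cluster's jars are used. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.5.0</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.10</artifactId>
  <version>1.5.0</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-hive_2.10</artifactId>
  <version>1.5.0</version>
  <scope>provided</scope>
</dependency>
```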


REPLIES


Explorer

Hi Srowen. Thanks so much for your comment. You are right; I am new to this area. So far I have only used binary installations of Spark and Hive and then configured them to work together. I will check the consistency between the versions.

Explorer

Done! I reinstalled a newer version of Hive (1.2.1), and the job now runs well!