<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Error when using HiveContext: java.lang.NoSuchFieldError: HIVE_SUPPORT_SQL11_RESERVED_KEYWORDS in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Error-when-using-HiveContext-java-lang-NoSuchFieldError-HIVE/m-p/32363#M7652</link>
    <description>&lt;P&gt;Done! I installed a new version of Hive (1.2.1), and the job now runs well!&lt;/P&gt;</description>
    <pubDate>Sun, 27 Sep 2015 15:11:32 GMT</pubDate>
    <dc:creator>Dahn</dc:creator>
    <dc:date>2015-09-27T15:11:32Z</dc:date>
    <item>
      <title>Error when using HiveContext: java.lang.NoSuchFieldError: HIVE_SUPPORT_SQL11_RESERVED_KEYWORDS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Error-when-using-HiveContext-java-lang-NoSuchFieldError-HIVE/m-p/32333#M7649</link>
      <description>&lt;P&gt;Hello. I am new to Spark, and this error has cost me a lot of time. Please help me get past it.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;15/09/27 10:24:26 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.&lt;BR /&gt;Exception in thread "main" java.lang.NoSuchFieldError: HIVE_SUPPORT_SQL11_RESERVED_KEYWORDS&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext.defaultOverides(HiveContext.scala:175)&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext.&amp;lt;init&amp;gt;(HiveContext.scala:178)&lt;BR /&gt;at LoadHive2.main(LoadHive2.java:69)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:497)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)&lt;BR /&gt;15/09/27 10:24:26 INFO spark.SparkContext: Invoking stop() from shutdown hook&lt;BR /&gt;(followed by the usual Jetty ServletContextHandler shutdown logging as the SparkContext stops)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;My Java code:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;import org.apache.spark.SparkContext;&lt;BR /&gt;import org.apache.spark.SparkConf;&lt;BR /&gt;import org.apache.spark.sql.SQLContext;&lt;BR /&gt;import org.apache.hadoop.hive.conf.HiveConf;&lt;BR /&gt;import org.apache.spark.sql.hive.HiveContext;&lt;/P&gt;&lt;P&gt;public class LoadHive2 {&lt;/P&gt;&lt;P&gt;public static void main(String[] args) {&lt;/P&gt;&lt;P&gt;SparkConf sparkConf = new SparkConf().setAppName("WordCount");&lt;BR /&gt;SparkContext sc = new SparkContext(sparkConf);&lt;/P&gt;&lt;P&gt;HiveContext sqlContext = new org.apache.spark.sql.hive.HiveContext(sc);&lt;BR /&gt;sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)");&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;}//main&lt;BR /&gt;}//class&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;My
pom.xml:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;lt;project&amp;gt;&lt;/P&gt;&lt;P&gt;&amp;lt;groupId&amp;gt;LIMOS&amp;lt;/groupId&amp;gt;&lt;BR /&gt;&amp;lt;artifactId&amp;gt;load-hive&amp;lt;/artifactId&amp;gt;&lt;BR /&gt;&amp;lt;modelVersion&amp;gt;4.0.0&amp;lt;/modelVersion&amp;gt;&lt;BR /&gt;&lt;BR /&gt;&amp;lt;name&amp;gt;LoadHive Project&amp;lt;/name&amp;gt;&lt;BR /&gt;&amp;lt;packaging&amp;gt;jar&amp;lt;/packaging&amp;gt;&lt;BR /&gt;&amp;lt;version&amp;gt;1.0&amp;lt;/version&amp;gt;&lt;/P&gt;&lt;P&gt;&amp;lt;dependencies&amp;gt;&lt;/P&gt;&lt;P&gt;&amp;lt;dependency&amp;gt;&lt;BR /&gt;&amp;lt;groupId&amp;gt;org.apache.spark&amp;lt;/groupId&amp;gt;&lt;BR /&gt;&amp;lt;artifactId&amp;gt;spark-core_2.10&amp;lt;/artifactId&amp;gt;&lt;BR /&gt;&amp;lt;version&amp;gt;1.4.0&amp;lt;/version&amp;gt;&lt;BR /&gt;&amp;lt;/dependency&amp;gt;&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&amp;lt;dependency&amp;gt;&lt;BR /&gt;&amp;lt;groupId&amp;gt;org.apache.spark&amp;lt;/groupId&amp;gt;&lt;BR /&gt;&amp;lt;artifactId&amp;gt;spark-sql_2.10&amp;lt;/artifactId&amp;gt;&lt;BR /&gt;&amp;lt;version&amp;gt;1.4.0&amp;lt;/version&amp;gt;&lt;BR /&gt;&amp;lt;/dependency&amp;gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;&amp;lt;dependency&amp;gt;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;lt;groupId&amp;gt;org.apache.spark&amp;lt;/groupId&amp;gt;&lt;BR /&gt;&amp;lt;artifactId&amp;gt;spark-hive_2.11&amp;lt;/artifactId&amp;gt;&lt;BR /&gt;&amp;lt;version&amp;gt;1.5.0&amp;lt;/version&amp;gt;&lt;BR /&gt;&amp;lt;/dependency&amp;gt;&lt;/P&gt;&lt;P&gt;&amp;lt;/dependencies&amp;gt;&lt;BR /&gt;&amp;lt;/project&amp;gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;My system install: Hadoop 2.7.1, hive 0.14.0,&amp;nbsp;spark-1.5.0-bin-hadoop2.6. I am using mysql for hive metastore. &amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The above code was built successfully to create a jar file using maven. 
But when I submitted this jar using the following command, the above error occurred:&lt;BR /&gt;&lt;BR /&gt;home/user/spark-1.5.0-bin-hadoop2.6/bin/spark-submit --class "LoadHive2" --master spark://10.0.2.10:7077 target/load-hive-1.0.jar&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;It is worth noting that when I tested some examples that did not use HiveContext, the jobs ran well on the Spark cluster. I can also access Hive directly to create tables or run SQL.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am looking at the solution in the following direction: the Hive attribute HIVE_SUPPORT_SQL11_RESERVED_KEYWORDS is not recognized by my current combination of Spark 1.5.0 and Hive 0.14.0. Although this attribute was recognized by the compiler that built the jar file, it was not recognized by the execution engine. (This suggests the compile-time and runtime Hive versions may not be the same.)&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;But so far I could not fix it, and it has given me quite a headache! Please help me! Thank you in advance.&lt;/P&gt;</description>
      <pubDate>Fri, 16 Sep 2022 09:41:55 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Error-when-using-HiveContext-java-lang-NoSuchFieldError-HIVE/m-p/32333#M7649</guid>
      <dc:creator>Dahn</dc:creator>
      <dc:date>2022-09-16T09:41:55Z</dc:date>
    </item>
    <item>
      <title>Re: Error when using HiveContext: java.lang.NoSuchFieldError: HIVE_SUPPORT_SQL11_RESERVED_KEYWORDS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Error-when-using-HiveContext-java-lang-NoSuchFieldError-HIVE/m-p/32334#M7650</link>
      <description>Here, you're using your own build of Spark against an older version of&lt;BR /&gt;Hive than what's in CDH. That might mostly work, but you're seeing the&lt;BR /&gt;problems of compiling and running against different versions. I'm afraid&lt;BR /&gt;you're on your own if you're rolling your own build, but I expect you&lt;BR /&gt;may get much closer if you make a build targeting the same Hive&lt;BR /&gt;version in CDH.</description>
      <pubDate>Sun, 27 Sep 2015 10:09:55 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Error-when-using-HiveContext-java-lang-NoSuchFieldError-HIVE/m-p/32334#M7650</guid>
      <dc:creator>srowen</dc:creator>
      <dc:date>2015-09-27T10:09:55Z</dc:date>
    </item>
    <item>
      <title>Re: Error when using HiveContext: java.lang.NoSuchFieldError: HIVE_SUPPORT_SQL11_RESERVED_KEYWORDS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Error-when-using-HiveContext-java-lang-NoSuchFieldError-HIVE/m-p/32335#M7651</link>
      <description>&lt;P&gt;Hi Srowen. Thanks so much for your comment. You are right; I am new in this area. So far I have only used binary installations of Spark and Hive and then configured them to work together. I will check the consistency between versions.&lt;/P&gt;</description>
      <pubDate>Sun, 27 Sep 2015 10:38:37 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Error-when-using-HiveContext-java-lang-NoSuchFieldError-HIVE/m-p/32335#M7651</guid>
      <dc:creator>Dahn</dc:creator>
      <dc:date>2015-09-27T10:38:37Z</dc:date>
    </item>
    <item>
      <title>Re: Error when using HiveContext: java.lang.NoSuchFieldError: HIVE_SUPPORT_SQL11_RESERVED_KEYWORDS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Error-when-using-HiveContext-java-lang-NoSuchFieldError-HIVE/m-p/32363#M7652</link>
      <description>&lt;P&gt;Done! I installed a new version of Hive (1.2.1), and the job now runs well!&lt;/P&gt;</description>
      <pubDate>Sun, 27 Sep 2015 15:11:32 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Error-when-using-HiveContext-java-lang-NoSuchFieldError-HIVE/m-p/32363#M7652</guid>
      <dc:creator>Dahn</dc:creator>
      <dc:date>2015-09-27T15:11:32Z</dc:date>
    </item>
  </channel>
</rss>
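<!-- Editor's note appended after the feed. The thread's resolution boils down to
     making the Hive/Spark versions consistent between compile time and runtime.
     Below is a minimal, hypothetical sketch (not from the thread itself) of how the
     poster's pom.xml dependencies could be aligned, assuming the
     spark-1.5.0-bin-hadoop2.6 (Scala 2.10) runtime they submit against. The
     original pom mixes spark-core_2.10/spark-sql_2.10 at 1.4.0 with
     spark-hive_2.11 at 1.5.0, so both the Spark version and the Scala binary
     version disagree. -->

```xml
<!-- Hypothetical corrected dependency block: every Spark artifact uses the
     same Scala binary version (2.10) and the same Spark release (1.5.0),
     matching the spark-1.5.0-bin-hadoop2.6 cluster runtime. -->
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.5.0</version>
    <!-- provided: the cluster supplies these jars at runtime, so the
         application jar cannot ship a conflicting copy -->
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>1.5.0</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.10</artifactId>
    <version>1.5.0</version>
    <scope>provided</scope>
  </dependency>
</dependencies>
```

This also fits the outcome reported in the final reply: Spark 1.5.0's Hive support targets Hive 1.2.1, so upgrading the standalone Hive installation from 0.14.0 to 1.2.1 removed the compile-time/runtime mismatch behind the NoSuchFieldError.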

