<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Spark job failed when new HiveContext object in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141962#M104555</link>
    <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/12972/jay2zhou.html" nodeid="12972"&gt;@Jay Zhou&lt;/A&gt; &lt;/P&gt;&lt;P&gt;Can you be a bit more specific about what you changed? What exactly did you do with this line?&lt;/P&gt;&lt;PRE&gt;val hiveSqlContext = new org.apache.spark.sql.hive.HiveContext(sc)&lt;/PRE&gt;&lt;P&gt;I have a similar problem where I get the error&lt;/P&gt;&lt;P&gt;WARN Hive: Failed to access metastore. This class should not accessed in runtime.&lt;/P&gt;&lt;P&gt;but only when I run the job via Oozie. When I use spark-submit the code works, so I assume the dependencies are right.&lt;/P&gt;&lt;P&gt;Do you have any idea what could cause this?&lt;/P&gt;</description>
    <pubDate>Thu, 20 Apr 2017 12:51:54 GMT</pubDate>
    <dc:creator>jiiiiken88</dc:creator>
    <dc:date>2017-04-20T12:51:54Z</dc:date>
    <item>
      <title>Spark job failed when new HiveContext object</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141946#M104539</link>
      <description>&lt;P&gt;We are using HDP 2.3.4. I also followed the instructions below.&lt;/P&gt;&lt;PRE&gt;spark-submit \
  --class &amp;lt;Your.class.name&amp;gt; \
  --master yarn-cluster \
  --num-executors 1 \
  --driver-memory 1g \
  --executor-memory 1g \
  --executor-cores 1 \
  --files /usr/hdp/current/spark-client/conf/hive-site.xml \
  --jars /usr/hdp/current/spark-client/lib/datanucleus-api-jdo-3.2.6.jar,/usr/hdp/current/spark-client/lib/datanucleus-rdbms-3.2.9.jar,/usr/hdp/current/spark-client/lib/datanucleus-core-3.2.10.jar \
  target/YOUR_JAR-1.0.0-SNAPSHOT.jar "show tables" "select * from your_table"&lt;/PRE&gt;&lt;P&gt;Here is the callstack:&lt;/P&gt;&lt;P&gt;16/09/06 15:20:35 WARN Hive: Failed to access metastore. This class should not accessed in runtime.
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
       at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)
       at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
       at org.apache.hadoop.hive.ql.metadata.Hive.&amp;lt;clinit&amp;gt;(Hive.java:166)
       at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
       at org.apache.spark.sql.hive.client.ClientWrapper.&amp;lt;init&amp;gt;(ClientWrapper.scala:193)
       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
       at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
       at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
       at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
       at org.apache.spark.sql.hive.client.IsolatedClientLoader.liftedTree1$1(IsolatedClientLoader.scala:183)
       at org.apache.spark.sql.hive.client.IsolatedClientLoader.&amp;lt;init&amp;gt;(IsolatedClientLoader.scala:179)
       at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:228)
       at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:187)
       at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:394)
       at org.apache.spark.sql.hive.HiveContext.defaultOverrides(HiveContext.scala:176)
       at org.apache.spark.sql.hive.HiveContext.&amp;lt;init&amp;gt;(HiveContext.scala:179)
       at com.cbt.ingest.tsz.TSZIngestApp$delayedInit$body.apply(TSZIngestApp.scala:50)
       at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
       at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
       at scala.App$anonfun$main$1.apply(App.scala:71)
       at scala.App$anonfun$main$1.apply(App.scala:71)
       at scala.collection.immutable.List.foreach(List.scala:318)
       at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)&lt;/P&gt;</description>
      <pubDate>Wed, 07 Sep 2016 22:24:28 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141946#M104539</guid>
      <dc:creator>jay2_zhou</dc:creator>
      <dc:date>2016-09-07T22:24:28Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job failed when new HiveContext object</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141947#M104540</link>
      <description>&lt;P&gt;Could you post your spark code?&lt;/P&gt;</description>
      <pubDate>Wed, 07 Sep 2016 23:52:40 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141947#M104540</guid>
      <dc:creator>Carolyn</dc:creator>
      <dc:date>2016-09-07T23:52:40Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job failed when new HiveContext object</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141948#M104541</link>
      <description>&lt;P&gt;Also, could you post any additional context in the stack trace.  Are there additional exceptions?&lt;/P&gt;</description>
      <pubDate>Wed, 07 Sep 2016 23:56:55 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141948#M104541</guid>
      <dc:creator>Carolyn</dc:creator>
      <dc:date>2016-09-07T23:56:55Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job failed when new HiveContext object</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141949#M104542</link>
      <description>&lt;P&gt;It fails at the very beginning of my code, where I create a new HiveContext object:&lt;/P&gt;&lt;PRE&gt;log.warn("Running Master: " + master.toString())
val sparkConf = new SparkConf().setAppName(APP_NAME)
                               .setMaster(master)
val sc = SparkContext.getOrCreate(sparkConf)
val sqlContext = new SQLContext(sc)
val hiveSqlContext = new org.apache.spark.sql.hive.HiveContext(sc)&lt;/PRE&gt;</description>
      <pubDate>Thu, 08 Sep 2016 00:54:49 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141949#M104542</guid>
      <dc:creator>jay2_zhou</dc:creator>
      <dc:date>2016-09-08T00:54:49Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job failed when new HiveContext object</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141950#M104543</link>
      <description>&lt;P&gt;Actually, I noticed the job connects to the metastore successfully at the beginning; later on it tries to connect again and fails.&lt;/P&gt;&lt;P&gt;16/09/07 13:14:11 INFO DFSClient:
Created HDFS_DELEGATION_TOKEN token 1297829 for jzhou5 on ha-hdfs:hd0&lt;/P&gt;&lt;P&gt;16/09/07 13:14:12 INFO
metastore: Trying to connect to metastore with URI
thrift://xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx&lt;/P&gt;&lt;P&gt;16/09/07 13:14:12 INFO
metastore: Connected to metastore.&lt;/P&gt;&lt;P&gt;16/09/07 13:14:12 INFO Client:
Uploading resource
file:/usr/hdp/2.3.4.0-3485/spark/lib/spark-assembly-1.5.2.2.3.4.0-3485-hadoop2.7.1.2.3.4.0-3485.jar
-&amp;gt; hdfs://hd0/user/.sparkStaging/application_1473231848025_1554/spark-assembly-1.5.2.2.3.4.0-3485-hadoop2.7.1.2.3.4.0-3485.jar&lt;/P&gt;&lt;P&gt;....&lt;/P&gt;&lt;P&gt;16/09/07 13:14:24 INFO
HiveContext: Initializing execution hive, version 1.2.1&lt;/P&gt;&lt;P&gt;16/09/07 13:14:24 INFO
ClientWrapper: Inspected Hadoop version: 2.7.1.2.3.4.0-3485&lt;/P&gt;&lt;P&gt;16/09/07 13:14:24 INFO
ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop
version 2.7.1.2.3.4.0-3485&lt;/P&gt;&lt;P&gt;16/09/07 13:14:24 INFO
metastore: Trying to connect to metastore with URI
thrift://xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx&lt;/P&gt;&lt;P&gt;16/09/07 13:14:24 INFO
metastore: Connected to metastore.&lt;/P&gt;&lt;P&gt;16/09/07 13:14:24 INFO
SessionState: Created local directory:
/tmp/2a51b1f7-5c87-4b2a-95c6-bc7eb06d900b_resources&lt;/P&gt;&lt;P&gt;16/09/07 13:14:24 INFO SessionState: Created HDFS directory:
/tmp/hive/jzhou5/2a51b1f7-5c87-4b2a-95c6-&lt;/P&gt;&lt;P&gt;....&lt;/P&gt;&lt;P&gt;16/09/07 13:14:25 INFO metastore:
Trying to connect to metastore with URI
thrift://xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx&lt;/P&gt;&lt;P&gt;16/09/07 13:14:25 WARN Hive: Failed
to access metastore. This class should not accessed in runtime.&lt;/P&gt;&lt;P&gt;org.apache.hadoop.hive.ql.metadata.HiveException:
java.lang.RuntimeException: Unable to instantiate
org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient&lt;/P&gt;&lt;P&gt; 
at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)&lt;/P&gt;&lt;P&gt; 
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)&lt;/P&gt;&lt;P&gt; 
at org.apache.hadoop.hive.ql.metadata.Hive.&amp;lt;clinit&amp;gt;(Hive.java:166)&lt;/P&gt;&lt;P&gt; 
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)&lt;/P&gt;&lt;P&gt; 
at
org.apache.spark.sql.hive.client.ClientWrapper.&amp;lt;init&amp;gt;(ClientWrapper.scala:193)&lt;/P&gt;&lt;P&gt; 
 at
sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)&lt;/P&gt;&lt;P&gt; 
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)&lt;/P&gt;&lt;P&gt; 
at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;/P&gt;</description>
      <pubDate>Thu, 08 Sep 2016 00:59:55 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141950#M104543</guid>
      <dc:creator>jay2_zhou</dc:creator>
      <dc:date>2016-09-08T00:59:55Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job failed when new HiveContext object</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141951#M104544</link>
      <description>&lt;P&gt;Here is the full error output of the job:&lt;/P&gt;&lt;P&gt;16/09/07 14:21:36 WARN Hive: Failed to access metastore. This class should not accessed in runtime.
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)
        at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
        at org.apache.hadoop.hive.ql.metadata.Hive.&amp;lt;clinit&amp;gt;(Hive.java:166)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
        at org.apache.spark.sql.hive.client.ClientWrapper.&amp;lt;init&amp;gt;(ClientWrapper.scala:193)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader.liftedTree1$1(IsolatedClientLoader.scala:183)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader.&amp;lt;init&amp;gt;(IsolatedClientLoader.scala:179)
        at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:228)
        at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:187)
        at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:394)
        at org.apache.spark.sql.hive.HiveContext.defaultOverrides(HiveContext.scala:176)
        at org.apache.spark.sql.hive.HiveContext.&amp;lt;init&amp;gt;(HiveContext.scala:179)
        at com.cbt.ingest.tsz.TSZIngestApp$delayedInit$body.apply(TSZIngestApp.scala:53)
        at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
        at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
        at scala.App$anonfun$main$1.apply(App.scala:71)
        at scala.App$anonfun$main$1.apply(App.scala:71)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
        at scala.App$class.main(App.scala:71)
        at com.cbt.ingest.tsz.TSZIngestApp.main(TSZIngestApp.scala:29)
        at com.cbt.ingest.tsz.GenericTSZIngest.main(TSZIngestApp.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$runMain(SparkSubmit.scala:685)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.&amp;lt;init&amp;gt;(RetryingMetaStoreClient.java:86)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
        at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
        ... 34 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
        ... 40 more
Caused by: java.lang.IllegalStateException: Error finding hadoop SASL properties
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge23.getHadoopSaslProperties(HadoopThriftAuthBridge23.java:103)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.getMetaStoreSaslProperties(MetaStoreUtils.java:1588)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:401)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:236)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.&amp;lt;init&amp;gt;(SessionHiveMetaStoreClient.java:74)
        ... 45 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge23.getHadoopSaslProperties(HadoopThriftAuthBridge23.java:98)
        ... 49 more
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.util.StringUtils.toUpperCase(Ljava/lang/String;)Ljava/lang/String;
        at org.apache.hadoop.security.SaslPropertiesResolver.setConf(SaslPropertiesResolver.java:69)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at org.apache.hadoop.security.SaslPropertiesResolver.getInstance(SaslPropertiesResolver.java:58)
        ... 54 more
16/09/07 14:21:36 INFO metastore: Trying to connect to metastore with URI thrift://xxxxxxxxxxxxxxxxxxx:9083
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
        at org.apache.spark.sql.hive.client.ClientWrapper.&amp;lt;init&amp;gt;(ClientWrapper.scala:193)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader.liftedTree1$1(IsolatedClientLoader.scala:183)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader.&amp;lt;init&amp;gt;(IsolatedClientLoader.scala:179)
        at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:228)
        at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:187)
        at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:394)
        at org.apache.spark.sql.hive.HiveContext.defaultOverrides(HiveContext.scala:176)
        at org.apache.spark.sql.hive.HiveContext.&amp;lt;init&amp;gt;(HiveContext.scala:179)
        at com.cbt.ingest.tsz.TSZIngestApp$delayedInit$body.apply(TSZIngestApp.scala:53)
        at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
        at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
        at scala.App$anonfun$main$1.apply(App.scala:71)
        at scala.App$anonfun$main$1.apply(App.scala:71)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
        at scala.App$class.main(App.scala:71)
        at com.cbt.ingest.tsz.TSZIngestApp.main(TSZIngestApp.scala:29)
        at com.cbt.ingest.tsz.GenericTSZIngest.main(TSZIngestApp.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$runMain(SparkSubmit.scala:685)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.&amp;lt;init&amp;gt;(RetryingMetaStoreClient.java:86)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
        ... 31 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
        ... 37 more
Caused by: java.lang.IllegalStateException: Error finding hadoop SASL properties
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge23.getHadoopSaslProperties(HadoopThriftAuthBridge23.java:103)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.getMetaStoreSaslProperties(MetaStoreUtils.java:1588)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:401)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:236)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.&amp;lt;init&amp;gt;(SessionHiveMetaStoreClient.java:74)
        ... 42 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge23.getHadoopSaslProperties(HadoopThriftAuthBridge23.java:98)
        ... 46 more
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.util.StringUtils.toUpperCase(Ljava/lang/String;)Ljava/lang/String;
        at org.apache.hadoop.security.SaslPropertiesResolver.setConf(SaslPropertiesResolver.java:69)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at org.apache.hadoop.security.SaslPropertiesResolver.getInstance(SaslPropertiesResolver.java:58)
        ... 51 more
16/09/07 14:21:36 INFO SparkContext: Invoking stop() from shutdown hook&lt;/P&gt;</description>
      <pubDate>Thu, 08 Sep 2016 01:42:22 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141951#M104544</guid>
      <dc:creator>jay2_zhou</dc:creator>
      <dc:date>2016-09-08T01:42:22Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job failed when new HiveContext object</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141952#M104545</link>
      <description>&lt;P&gt;Is Kerberos enabled on the cluster?&lt;/P&gt;</description>
      <pubDate>Thu, 08 Sep 2016 03:55:36 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141952#M104545</guid>
      <dc:creator>lgeorge</dc:creator>
      <dc:date>2016-09-08T03:55:36Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job failed when new HiveContext object</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141953#M104546</link>
      <description>&lt;P&gt;The problem is resolved by using SQLContext in the Spark application code. Thanks for the quick response.&lt;/P&gt;</description>
      <pubDate>Thu, 08 Sep 2016 04:52:06 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141953#M104546</guid>
      <dc:creator>jay2_zhou</dc:creator>
      <dc:date>2016-09-08T04:52:06Z</dc:date>
    </item>
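The workaround described above (dropping HiveContext in favour of a plain SQLContext) can be sketched as follows. This is a minimal sketch against the Spark 1.5.x API used throughout this thread; the app name and input path are hypothetical, and note the trade-off: SQLContext never touches the Hive metastore, so it sidesteps the SASL failure, but it also gives up Hive tables, Hive UDFs, and the shared metastore catalog.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SqlContextOnlyApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("SqlContextOnlyApp") // hypothetical name
    val sc = SparkContext.getOrCreate(conf)

    // SQLContext supports DataFrames and temp tables without any metastore
    // connection, unlike HiveContext which eagerly initializes a Hive client.
    val sqlContext = new SQLContext(sc)

    val df = sqlContext.read.json("hdfs:///data/events.json") // hypothetical path
    df.registerTempTable("events")
    sqlContext.sql("SELECT COUNT(1) FROM events").show()
  }
}
```

Anything that genuinely needs Hive tables still requires HiveContext, which is why the thread continues below.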
    <item>
      <title>Re: Spark job failed when new HiveContext object</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141954#M104547</link>
      <description>&lt;P&gt;Actually, the problem still exists, since I have to use HiveContext. I just noticed that ClientWrapper inspected two different Hadoop versions; the first one is correct and the second one is wrong (highlighted below). Could this be the root cause?&lt;/P&gt;&lt;P&gt;16/09/07 15:47:54 INFO BlockManagerMasterEndpoint:
Registering block manager xxxxxxxx&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;16/09/07 15:47:54 INFO HiveContext: Initializing execution
hive, version 1.2.1&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;16/09/07 15:47:54 INFO ClientWrapper: Inspected Hadoop
version: 2.7.1.2.3.4.0-3485&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;16/09/07 15:47:54 INFO ClientWrapper: Loaded
org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version
2.7.1.2.3.4.0-3485&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;16/09/07 15:47:54 INFO metastore: Trying to connect to
metastore with URI thrift://xxxxxxxxxxxxxxxxxxxxxxxx&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;16/09/07 15:47:54 INFO metastore: Connected to metastore.&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;16/09/07 15:47:54 INFO SessionState: Created local
directory: /tmp/1d43c90d-da99-4970-80fd-31c9ad9a8d4d_resources&lt;/P&gt;&lt;P&gt;16/09/07 15:47:54 INFO SessionState: Created HDFS directory:
/tmp/hive/jzhou5/1d43c90d-da99-4970-80fd-31c9ad9a8d4d&lt;/P&gt;&lt;P&gt;16/09/07 15:47:54 INFO SessionState: Created local
directory: /tmp/jzhou5/1d43c90d-da99-4970-80fd-31c9ad9a8d4d&lt;/P&gt;&lt;P&gt;16/09/07 15:47:54 INFO SessionState: Created HDFS directory:
/tmp/hive/jzhou5/1d43c90d-da99-4970-80fd-31c9ad9a8d4d/_tmp_space.db&lt;/P&gt;&lt;P&gt;16/09/07 15:47:54 INFO HiveContext: default warehouse
location is /user/hive/warehouse&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;16/09/07 15:47:54 INFO HiveContext: Initializing
HiveMetastoreConnection version 1.2.1 using Spark classes.&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;16/09/07 15:47:54 INFO ClientWrapper: Inspected Hadoop
version: 2.2.0&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;16/09/07 15:47:54 INFO ClientWrapper: Loaded
org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.2.0&lt;/P&gt;&lt;P&gt;16/09/07 15:47:55 INFO deprecation: mapred.reduce.tasks is
deprecated. Instead, use mapreduce.job.reduces&lt;/P&gt;&lt;P&gt;16/09/07 15:47:55 INFO deprecation: mapred.min.split.size is
deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize&lt;/P&gt;</description>
      <pubDate>Fri, 09 Sep 2016 10:58:27 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141954#M104547</guid>
      <dc:creator>jay2_zhou</dc:creator>
      <dc:date>2016-09-09T10:58:27Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job failed when new HiveContext object</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141955#M104548</link>
      <description>&lt;P&gt;I think there is definitely a clash of versions. The reflection error below indicates a version mismatch when the client is creating a session:&lt;/P&gt;&lt;P&gt;more Caused by: java.lang.NoSuchMethodError: &lt;STRONG&gt;org.apache.hadoop.util.StringUtils.toUpperCase(Ljava/lang/String;)Ljava/lang/String; at org.apache.hadoop.security.SaslPropertiesResolver.setConf(SaslPropertiesResolver.java:69)&lt;/STRONG&gt; at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73) at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133) at org.apache.hadoop.security.SaslPropertiesResolver.getInstance(SaslPropertiesResolver.java:58) ... 54 more 16/09/07 14:21:36 INFO metastore: Trying to connect to metastore with URI thrift://xxxxxxxxxxxxxxxxxxx:9083 Exception in thread "main"&lt;/P&gt;&lt;P&gt;Check the contents of the jars to make sure they are all compatible. For example, what are the contents of target/YOUR_JAR-1.0.0-SNAPSHOT.jar?&lt;/P&gt;</description>
      <pubDate>Fri, 09 Sep 2016 23:30:30 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141955#M104548</guid>
      <dc:creator>Carolyn</dc:creator>
      <dc:date>2016-09-09T23:30:30Z</dc:date>
    </item>
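One hedged way to confirm the clash suggested above is to check at runtime which jar the failing class was actually loaded from. WhichStringUtils is a hypothetical diagnostic you would run on the same classpath as the failing job; it uses only the standard JVM reflection API. The NoSuchMethodError means the StringUtils class that was loaded lacks the toUpperCase method, which is absent from older Hadoop releases such as the 2.2 jars the second ClientWrapper log line reports.

```scala
object WhichStringUtils {
  def main(args: Array[String]): Unit = {
    val cls = Class.forName("org.apache.hadoop.util.StringUtils")

    // Where did this class come from? An old Hadoop jar shaded into the
    // application fat jar would show up here instead of the cluster's
    // /usr/hdp/... libraries. (getCodeSource is non-null for jar-loaded classes.)
    val location = cls.getProtectionDomain.getCodeSource.getLocation
    println("StringUtils loaded from: " + location)

    // Does the loaded class expose any toUpperCase method at all?
    val hasMethod = cls.getMethods.exists(_.getName == "toUpperCase")
    println("has toUpperCase: " + hasMethod)
  }
}
```

If the location points inside target/YOUR_JAR-1.0.0-SNAPSHOT.jar, the fix is to stop bundling Hadoop classes into the application jar (for example by marking the Spark and Hadoop Maven dependencies as provided), as the pom.xml discussion below explores.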
    <item>
      <title>Re: Spark job failed when new HiveContext object</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141956#M104549</link>
      <description>&lt;P&gt;I built it in my local Windows environment. I noticed the Hadoop version is 2.2; the jars are downloaded automatically by the Maven build. Where do I set the version? I didn't see it in my pom.xml.&lt;/P&gt;&lt;P&gt;Here is part of my pom.xml:&lt;/P&gt;&lt;P&gt;&amp;lt;properties&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;project.build.sourceEncoding&amp;gt;UTF-8&amp;lt;/project.build.sourceEncoding&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;project.reporting.outputEncoding&amp;gt;UTF-8&amp;lt;/project.reporting.outputEncoding&amp;gt;&lt;/P&gt;&lt;P&gt;&amp;lt;!-- Component versions are defined here
--&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;hadoop.version&amp;gt;2.7.1&amp;lt;/hadoop.version&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;spark.version&amp;gt;1.5.2&amp;lt;/spark.version&amp;gt;&lt;/P&gt;&lt;P&gt;&amp;lt;avro.version&amp;gt;1.8.1&amp;lt;/avro.version&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;log4j.version&amp;gt;1.2.17&amp;lt;/log4j.version&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;scala.version&amp;gt;2.10.6&amp;lt;/scala.version&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;/properties&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;pluginRepositories&amp;gt;&lt;/P&gt;&lt;P&gt;&amp;lt;pluginRepository&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;id&amp;gt;&lt;U&gt;scala&lt;/U&gt;-tools.org&amp;lt;/id&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;name&amp;gt;&lt;U&gt;Scala&lt;/U&gt;-tools
Maven2 Repository&amp;lt;/name&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;url&amp;gt;http://scala-tools.org/repo-releases&amp;lt;/url&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;/pluginRepository&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;/pluginRepositories&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;dependencies&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;dependency&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;groupId&amp;gt;org.apache.spark&amp;lt;/groupId&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;artifactId&amp;gt;spark-core_2.10&amp;lt;/artifactId&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;version&amp;gt;${spark.version}&amp;lt;/version&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;/dependency&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;dependency&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;groupId&amp;gt;org.apache.spark&amp;lt;/groupId&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;artifactId&amp;gt;spark-sql_2.10&amp;lt;/artifactId&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;version&amp;gt;${spark.version}&amp;lt;/version&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;/dependency&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;dependency&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;groupId&amp;gt;com.datastax.spark&amp;lt;/groupId&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;artifactId&amp;gt;spark-&lt;U&gt;cassandra&lt;/U&gt;-connector_2.10&amp;lt;/artifactId&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;version&amp;gt;1.5.1&amp;lt;/version&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;/dependency&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;dependency&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;groupId&amp;gt;org.apache.spark&amp;lt;/groupId&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;artifactId&amp;gt;spark-hive_2.10&amp;lt;/artifactId&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;version&amp;gt;${spark.version}&amp;lt;/version&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;/dependency&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;dependency&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;groupId&amp;gt;com.databricks&amp;lt;/groupId&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;artifactId&amp;gt;spark-csv_2.10&amp;lt;/artifactId&amp;gt;&lt;/P&gt;&lt;P&gt;   &amp;lt;version&amp;gt;1.4.0&amp;lt;/version&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;/dependency&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;dependency&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;groupId&amp;gt;com.databricks&amp;lt;/groupId&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;artifactId&amp;gt;spark-xml_2.10&amp;lt;/artifactId&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;version&amp;gt;0.3.3&amp;lt;/version&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;/dependency&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;dependency&amp;gt;&lt;/P&gt;&lt;P&gt;   &amp;lt;groupId&amp;gt;com.databricks&amp;lt;/groupId&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;artifactId&amp;gt;spark-avro_2.10&amp;lt;/artifactId&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;version&amp;gt;2.0.1&amp;lt;/version&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;/dependency&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;dependency&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;groupId&amp;gt;com.google.guava&amp;lt;/groupId&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;artifactId&amp;gt;&lt;U&gt;guava&lt;/U&gt;&amp;lt;/artifactId&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;version&amp;gt;18.0&amp;lt;/version&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;/dependency&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;dependency&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;groupId&amp;gt;org.scalikejdbc&amp;lt;/groupId&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;artifactId&amp;gt;scalikejdbc_2.10&amp;lt;/artifactId&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;version&amp;gt;2.4.2&amp;lt;/version&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;/dependency&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;dependency&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;groupId&amp;gt;org.apache.spark&amp;lt;/groupId&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;artifactId&amp;gt;spark-mllib_2.10&amp;lt;/artifactId&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;version&amp;gt;${spark.version}&amp;lt;/version&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;/dependency&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;dependency&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;groupId&amp;gt;org.apache.hive&amp;lt;/groupId&amp;gt;&lt;/P&gt;&lt;P&gt;  &amp;lt;artifactId&amp;gt;hive-&lt;U&gt;jdbc&lt;/U&gt;&amp;lt;/artifactId&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;version&amp;gt;1.2.1&amp;lt;/version&amp;gt;&lt;/P&gt;&lt;P&gt; 
&amp;lt;/dependency&amp;gt;&lt;/P&gt;</description>
      <pubDate>Sat, 10 Sep 2016 02:34:06 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141956#M104549</guid>
      <dc:creator>jay2_zhou</dc:creator>
      <dc:date>2016-09-10T02:34:06Z</dc:date>
    </item>
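A note on the pom above: the version that actually ends up on the classpath is easiest to confirm with Maven's dependency-tree report rather than by reading the pom. A minimal sketch, assuming it is run from the directory containing the pom.xml (the exact tree output will depend on the local repository):

```shell
# Print the resolved dependency tree, filtered to hadoop-common only.
# Whatever version appears here is what the build really uses; the
# <hadoop.version> property has no effect unless some <dependency>
# entry actually references ${hadoop.version}.
mvn dependency:tree -Dincludes=org.apache.hadoop:hadoop-common
```

In a pom like the one above, where no dependency references `${hadoop.version}`, the tree would typically show hadoop-common arriving transitively via spark-core_2.10, which explains seeing 2.2 despite the 2.7.1 property.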
    <item>
      <title>Re: Spark job failed when new HiveContext object</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141957#M104550</link>
      <description>&lt;P&gt;I agree with @cduby that there is a version conflict between the Hadoop library in use and what Spark actually expects. The best way to track down such a problem is Maven's dependency:tree goal, combined with knowing which artifact contains the problematic class; that tells you which transitive dependencies your Spark application fetches by default.&lt;/P&gt;&lt;P&gt;I had exactly the same problem, and I solved it with the following process:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Find which artifact the org.apache.hadoop.util.StringUtils class belongs to. This is the hadoop-common library.&lt;/LI&gt;&lt;LI&gt;Run mvn dependency:tree to find out which version of that jar Spark fetches by default (automatic dependency resolution only happens if you haven't already provided the Hadoop libraries yourself). In my case it was version 2.2 of Apache hadoop-common.&lt;/LI&gt;&lt;LI&gt;Find the library version that contains the correct StringUtils. This can be quite difficult, but in my case I happened to know from other projects that it was 2.6.1.&lt;/LI&gt;&lt;LI&gt;Declare that dependency in your pom.xml before the Spark dependency, so that it takes precedence over Spark's transitive dependency.&lt;/LI&gt;&lt;LI&gt;Then it should work.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;The following hadoop-common dependency solved the problem for me:&lt;/P&gt;&lt;PRE&gt;&amp;lt;dependency&amp;gt;
  &amp;lt;groupId&amp;gt;org.apache.hadoop&amp;lt;/groupId&amp;gt;
  &amp;lt;artifactId&amp;gt;hadoop-common&amp;lt;/artifactId&amp;gt;
  &amp;lt;version&amp;gt;2.6.1&amp;lt;/version&amp;gt;
&amp;lt;/dependency&amp;gt;
&amp;lt;!-- Spark dependencies --&amp;gt;
&amp;lt;dependency&amp;gt;
...&lt;/PRE&gt;</description>
      <pubDate>Mon, 26 Sep 2016 23:06:10 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141957#M104550</guid>
      <dc:creator>georgios_gkekas</dc:creator>
      <dc:date>2016-09-26T23:06:10Z</dc:date>
    </item>
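The answer above relies on declaration order to win dependency mediation. Not something suggested in the thread, but a sketch of a standard Maven alternative: a `dependencyManagement` section pins the version of a transitive dependency explicitly, independent of where the `<dependency>` entries appear.

```xml
<!-- Hypothetical pom.xml fragment: forces hadoop-common 2.6.1 for the
     whole build, including Spark's transitive pull, without depending
     on the order of <dependency> declarations. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>2.6.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Rerunning `mvn dependency:tree` afterwards should show the pinned version marked as managed, which makes the fix easy to verify.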
    <item>
      <title>Re: Spark job failed when new HiveContext object</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141958#M104551</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/12972/jay2zhou.html" nodeid="12972"&gt;@Jay Zhou&lt;/A&gt; and &lt;A rel="user" href="https://community.cloudera.com/users/10085/georgiosgkekas.html" nodeid="10085"&gt;@Georgios Gkekas&lt;/A&gt; Also check out this article on how to use the artifacts in the Hortonworks repository from Maven.  It is for building streaming applications but can should be able to translate to other Spark applications:&lt;/P&gt;&lt;P&gt;&lt;A href="https://community.hortonworks.com/articles/30430/a-maven-pomxml-for-java-based-sparkstreaming-appli.html" target="_blank"&gt;https://community.hortonworks.com/articles/30430/a-maven-pomxml-for-java-based-sparkstreaming-appli.html&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 26 Sep 2016 23:29:34 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141958#M104551</guid>
      <dc:creator>Carolyn</dc:creator>
      <dc:date>2016-09-26T23:29:34Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job failed when new HiveContext object</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141959#M104552</link>
      <description>&lt;P&gt;Thanks, yes, that is what I did. I resolved this issue a few weeks ago; sorry for the late update.&lt;/P&gt;</description>
      <pubDate>Mon, 26 Sep 2016 23:31:36 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141959#M104552</guid>
      <dc:creator>jay2_zhou</dc:creator>
      <dc:date>2016-09-26T23:31:36Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job failed when new HiveContext object</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141960#M104553</link>
      <description>&lt;P&gt;Glad you got it working. &lt;/P&gt;</description>
      <pubDate>Mon, 26 Sep 2016 23:33:55 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141960#M104553</guid>
      <dc:creator>Carolyn</dc:creator>
      <dc:date>2016-09-26T23:33:55Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job failed when new HiveContext object</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141961#M104554</link>
      <description>&lt;P&gt;Thanks for helping.&lt;/P&gt;</description>
      <pubDate>Tue, 27 Sep 2016 01:21:03 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141961#M104554</guid>
      <dc:creator>jay2_zhou</dc:creator>
      <dc:date>2016-09-27T01:21:03Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job failed when new HiveContext object</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141962#M104555</link>
      <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/12972/jay2zhou.html" nodeid="12972"&gt;@Jay Zhou&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Can you be a bit more specific about what you changed? What exactly did you do with this line?&lt;/P&gt;&lt;PRE&gt;val hiveSqlContext = new org.apache.spark.sql.hive.HiveContext(sc)&lt;/PRE&gt;&lt;P&gt;I have a similar problem where I get the error&lt;/P&gt;&lt;P&gt;WARN Hive: Failed to access metastore. This class should not accessed in runtime.&lt;/P&gt;&lt;P&gt;but only when I run the job via Oozie. When I use spark-submit the code works, so I assume the dependencies are right.&lt;/P&gt;&lt;P&gt;Do you have any idea what could cause this?&lt;/P&gt;</description>
      <pubDate>Thu, 20 Apr 2017 12:51:54 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-failed-when-new-HiveContext-object/m-p/141962#M104555</guid>
      <dc:creator>jiiiiken88</dc:creator>
      <dc:date>2017-04-20T12:51:54Z</dc:date>
    </item>
  </channel>
</rss>

