<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Spark + LLAP problems after upgrade to HDP 3.0 in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-LLAP-problems-after-upgrade-to-HDP-3-0/m-p/215157#M82351</link>
    <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/13196/berryosterlund.html" nodeid="13196"&gt;@Berry Österlund&lt;/A&gt;
&lt;/P&gt;&lt;P&gt;What setting do you have for &lt;EM&gt;spark.security.credentials.hiveserver2.enabled&lt;/EM&gt;?&lt;/P&gt;&lt;P&gt;Please try setting it to &lt;EM&gt;false&lt;/EM&gt; for client mode on a kerberized cluster.&lt;/P&gt;&lt;P&gt;Also make sure &lt;EM&gt;spark.sql.hive.hiveserver2.jdbc.url.principal&lt;/EM&gt; is set, if it is not already.&lt;/P&gt;</description>
    <pubDate>Sat, 18 Aug 2018 01:24:01 GMT</pubDate>
    <dc:creator>ewohlstadter</dc:creator>
    <dc:date>2018-08-18T01:24:01Z</dc:date>
    <item>
      <title>Spark + LLAP problems after upgrade to HDP 3.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-LLAP-problems-after-upgrade-to-HDP-3-0/m-p/215154#M82348</link>
      <description>&lt;P&gt;I’m upgrading one of our clusters to HDP 3.0 right now, and the upgrade itself went fine. But after the upgrade, I just can’t get Spark with LLAP to work. This is not a new feature for us; we have been using it for as long as it has been supported.&lt;/P&gt;&lt;P&gt;As there are some changes in the configuration, I’ve updated the config according to both&lt;BR /&gt;
&lt;A href="https://docs.hortonworks.com/HDPDocuments/HDP3/HDP-3.0.0/integrating-hive/content/hive_hivewarehouseconnector_for_handling_apache_spark_data.html"&gt;https://docs.hortonworks.com/HDPDocuments/HDP3/HDP-3.0.0/integrating-hive/content/hive_hivewarehouseconnector_for_handling_apache_spark_data.html&lt;/A&gt;&lt;BR /&gt;
and&lt;BR /&gt;
&lt;A href="https://github.com/hortonworks-spark/spark-llap/tree/master"&gt;https://github.com/hortonworks-spark/spark-llap/tree/master&lt;/A&gt;&lt;/P&gt;&lt;P&gt;The test code I’m running is the following:&lt;/P&gt;&lt;P&gt;spark-shell
--master yarn --deploy-mode client --jars
/usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.0.0.0-1634.jar&lt;/P&gt;&lt;PRE&gt;import com.hortonworks.hwc.HiveWarehouseSession
import com.hortonworks.hwc.HiveWarehouseSession._
val hive = HiveWarehouseSession.session(spark).build()
hive.showDatabases().show(100)&lt;/PRE&gt;&lt;P&gt;The error I get is the following:&lt;/P&gt;&lt;PRE&gt;java.lang.RuntimeException:
java.sql.SQLException: Cannot create PoolableConnectionFactory (Could not open
client transport with JDBC Uri: jdbc:hive2://&amp;lt;server&amp;gt;:10501/;transportMode=http;httpPath=cliservice;auth=delegationToken:
Could not establish connection to jdbc:hive2:// &amp;lt;server&amp;gt;:10501/;transportMode=http;httpPath=cliservice;auth=delegationToken:
HTTP Response code: 401)&lt;/PRE&gt;&lt;P&gt;The Hive server shows the following:&lt;/P&gt;&lt;PRE&gt;2018-08-17T07:28:50,759
INFO  [HiveServer2-HttpHandler-Pool:
Thread-175]: thrift.ThriftHttpServlet (ThriftHttpServlet.java:doPost(146)) -
Could not validate cookie sent, will try to generate a new cookie&lt;BR /&gt;2018-08-17T07:28:50,759
INFO  [HiveServer2-HttpHandler-Pool:
Thread-175]: thrift.ThriftHttpServlet
(ThriftHttpServlet.java:doKerberosAuth(399)) - Failed to authenticate with
http/_HOST kerberos principal, trying with hive/_HOST kerberos principal&lt;BR /&gt;2018-08-17T07:28:50,760
ERROR [HiveServer2-HttpHandler-Pool: Thread-175]: thrift.ThriftHttpServlet
(ThriftHttpServlet.java:doKerberosAuth(407)) - Failed to authenticate with
hive/_HOST kerberos principal&lt;BR /&gt;2018-08-17T07:28:50,760
ERROR [HiveServer2-HttpHandler-Pool: Thread-175]: thrift.ThriftHttpServlet
(ThriftHttpServlet.java:doPost(210)) - Error:&lt;BR /&gt;org.apache.hive.service.auth.HttpAuthenticationException:
java.lang.reflect.UndeclaredThrowableException&lt;BR /&gt;  at
org.apache.hive.service.cli.thrift.ThriftHttpServlet.doKerberosAuth(ThriftHttpServlet.java:408)
~[hive-service-3.1.0.3.0.0.0-1634.jar:3.1.0.3.0.0.0-1634]&lt;BR /&gt;  at
org.apache.hive.service.cli.thrift.ThriftHttpServlet.doPost(ThriftHttpServlet.java:160)
[hive-service-3.1.0.3.0.0.0-1634.jar:3.1.0.3.0.0.0-1634]&lt;BR /&gt;  at
javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
[javax.servlet-api-3.1.0.jar:3.1.0]&lt;BR /&gt;  at
javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
[javax.servlet-api-3.1.0.jar:3.1.0]&lt;BR /&gt;  at
org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:848)
[jetty-runner-9.3.20.v20170531.jar:9.3.20.v20170531]&lt;BR /&gt;  at
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:584)
[jetty-runner-9.3.20.v20170531.jar:9.3.20.v20170531]&lt;BR /&gt;  at
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:224)
[jetty-runner-9.3.20.v20170531.jar:9.3.20.v20170531]&lt;BR /&gt;  at
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
[jetty-runner-9.3.20.v20170531.jar:9.3.20.v20170531]&lt;BR /&gt;  at
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)
[jetty-runner-9.3.20.v20170531.jar:9.3.20.v20170531]&lt;BR /&gt;  at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
[jetty-runner-9.3.20.v20170531.jar:9.3.20.v20170531]&lt;BR /&gt;  at
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
[jetty-runner-9.3.20.v20170531.jar:9.3.20.v20170531]&lt;BR /&gt;  at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
[jetty-runner-9.3.20.v20170531.jar:9.3.20.v20170531]&lt;BR /&gt;  at
org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:493)
[jetty-runner-9.3.20.v20170531.jar:9.3.20.v20170531]&lt;BR /&gt;  at
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
[jetty-runner-9.3.20.v20170531.jar:9.3.20.v20170531]&lt;BR /&gt;  at
org.eclipse.jetty.server.Server.handle(Server.java:534) [jetty-runner-9.3.20.v20170531.jar:9.3.20.v20170531]&lt;BR /&gt;  at
org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:320)
[jetty-runner-9.3.20.v20170531.jar:9.3.20.v20170531]&lt;BR /&gt;  at
org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)
[jetty-runner-9.3.20.v20170531.jar:9.3.20.v20170531]&lt;BR /&gt;  at
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283)
[jetty-io-9.3.20.v20170531.jar:9.3.20.v20170531]&lt;BR /&gt;  at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:108)
[jetty-io-9.3.20.v20170531.jar:9.3.20.v20170531]&lt;BR /&gt;  at
org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
[jetty-io-9.3.20.v20170531.jar:9.3.20.v20170531]&lt;BR /&gt;  at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
[jetty-runner-9.3.20.v20170531.jar:9.3.20.v20170531]&lt;BR /&gt;  at
org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
[jetty-runner-9.3.20.v20170531.jar:9.3.20.v20170531]&lt;BR /&gt;  at
org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
[jetty-runner-9.3.20.v20170531.jar:9.3.20.v20170531]&lt;BR /&gt;  at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[?:1.8.0_112]&lt;BR /&gt;  at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[?:1.8.0_112]&lt;BR /&gt;  at java.lang.Thread.run(Thread.java:745)
[?:1.8.0_112]&lt;BR /&gt;Caused by:
java.lang.reflect.UndeclaredThrowableException&lt;BR /&gt;  at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1706)
~[hadoop-common-3.1.0.3.0.0.0-1634.jar:?]&lt;BR /&gt;  at org.apache.hive.service.cli.thrift.ThriftHttpServlet.doKerberosAuth(ThriftHttpServlet.java:405)
~[hive-service-3.1.0.3.0.0.0-1634.jar:3.1.0.3.0.0.0-1634]&lt;BR /&gt;  ... 25 more&lt;BR /&gt;Caused by:
org.apache.hive.service.auth.HttpAuthenticationException: Kerberos
authentication failed:&lt;BR /&gt;  at org.apache.hive.service.cli.thrift.ThriftHttpServlet$HttpKerberosServerAction.run(ThriftHttpServlet.java:464)
~[hive-service-3.1.0.3.0.0.0-1634.jar:3.1.0.3.0.0.0-1634]&lt;BR /&gt;  at
org.apache.hive.service.cli.thrift.ThriftHttpServlet$HttpKerberosServerAction.run(ThriftHttpServlet.java:413)
~[hive-service-3.1.0.3.0.0.0-1634.jar:3.1.0.3.0.0.0-1634]&lt;BR /&gt;  at
java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_112]&lt;BR /&gt;  at
javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_112]&lt;BR /&gt;   at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1688)
~[hadoop-common-3.1.0.3.0.0.0-1634.jar:?]&lt;BR /&gt;  at
org.apache.hive.service.cli.thrift.ThriftHttpServlet.doKerberosAuth(ThriftHttpServlet.java:405)
~[hive-service-3.1.0.3.0.0.0-1634.jar:3.1.0.3.0.0.0-1634]&lt;BR /&gt;  ... 25 more&lt;BR /&gt;Caused by:
org.ietf.jgss.GSSException: Defective token detected (Mechanism level:
GSSHeader did not find the right tag)&lt;BR /&gt;  at
sun.security.jgss.GSSHeader.&amp;lt;init&amp;gt;(GSSHeader.java:97) ~[?:1.8.0_112]&lt;BR /&gt;  at
sun.security.jgss.GSSContextImpl.acceptSecContext(GSSContextImpl.java:306)
~[?:1.8.0_112]&lt;BR /&gt;  at
sun.security.jgss.GSSContextImpl.acceptSecContext(GSSContextImpl.java:285)
~[?:1.8.0_112]&lt;BR /&gt;  at
org.apache.hive.service.cli.thrift.ThriftHttpServlet$HttpKerberosServerAction.run(ThriftHttpServlet.java:452)
~[hive-service-3.1.0.3.0.0.0-1634.jar:3.1.0.3.0.0.0-1634]&lt;BR /&gt;  at
org.apache.hive.service.cli.thrift.ThriftHttpServlet$HttpKerberosServerAction.run(ThriftHttpServlet.java:413)
~[hive-service-3.1.0.3.0.0.0-1634.jar:3.1.0.3.0.0.0-1634]&lt;BR /&gt;  at
java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_112]&lt;BR /&gt;  at
javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_112]&lt;BR /&gt;  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1688)
~[hadoop-common-3.1.0.3.0.0.0-1634.jar:?]&lt;BR /&gt;  at
org.apache.hive.service.cli.thrift.ThriftHttpServlet.doKerberosAuth(ThriftHttpServlet.java:405)
~[hive-service-3.1.0.3.0.0.0-1634.jar:3.1.0.3.0.0.0-1634]&lt;BR /&gt;  ... 25 more &lt;/PRE&gt;&lt;P&gt;I can see that it complains about the Kerberos ticket, but I do have a valid ticket in my session. Any other Kerberos access, like beeline, works fine from the same session.&lt;/P&gt;&lt;P&gt;Does anybody have any clue about this error?&lt;/P&gt;</description>
      <pubDate>Fri, 17 Aug 2018 15:51:04 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-LLAP-problems-after-upgrade-to-HDP-3-0/m-p/215154#M82348</guid>
      <dc:creator>berry_osterlund</dc:creator>
      <dc:date>2018-08-17T15:51:04Z</dc:date>
    </item>
    <item>
      <title>Re: Spark + LLAP problems after upgrade to HDP 3.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-LLAP-problems-after-upgrade-to-HDP-3-0/m-p/215155#M82349</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/13196/berryosterlund.html" nodeid="13196"&gt;@Berry Österlund&lt;/A&gt; &lt;/P&gt;&lt;P&gt;I think the problem might be related to some missing configurations, please check you have set all as per:&lt;/P&gt;&lt;P&gt;&lt;A href="https://github.com/hortonworks-spark/spark-llap" target="_blank"&gt;https://github.com/hortonworks-spark/spark-llap&lt;/A&gt;&lt;/P&gt;&lt;P&gt;HTH&lt;/P&gt;</description>
      <pubDate>Fri, 17 Aug 2018 18:42:08 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-LLAP-problems-after-upgrade-to-HDP-3-0/m-p/215155#M82349</guid>
      <dc:creator>falbani</dc:creator>
      <dc:date>2018-08-17T18:42:08Z</dc:date>
    </item>
    <item>
      <title>Re: Spark + LLAP problems after upgrade to HDP 3.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-LLAP-problems-after-upgrade-to-HDP-3-0/m-p/215156#M82350</link>
      <description>&lt;P&gt;Thanks for the answer. But I have verified those settings at least ten times now, and they are correct as far as I can see. This cluster worked with Spark + LLAP (even in Livy) on HDP 2.6.5, and most of these settings are the same.&lt;/P&gt;</description>
      <pubDate>Fri, 17 Aug 2018 19:02:24 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-LLAP-problems-after-upgrade-to-HDP-3-0/m-p/215156#M82350</guid>
      <dc:creator>berry_osterlund</dc:creator>
      <dc:date>2018-08-17T19:02:24Z</dc:date>
    </item>
    <item>
      <title>Re: Spark + LLAP problems after upgrade to HDP 3.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-LLAP-problems-after-upgrade-to-HDP-3-0/m-p/215157#M82351</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/13196/berryosterlund.html" nodeid="13196"&gt;@Berry Österlund&lt;/A&gt;
&lt;/P&gt;&lt;P&gt;What setting do you have for &lt;EM&gt;spark.security.credentials.hiveserver2.enabled&lt;/EM&gt;?&lt;/P&gt;&lt;P&gt;Please try setting it to &lt;EM&gt;false&lt;/EM&gt; for client mode on a kerberized cluster.&lt;/P&gt;&lt;P&gt;Also make sure &lt;EM&gt;spark.sql.hive.hiveserver2.jdbc.url.principal&lt;/EM&gt; is set, if it is not already.&lt;/P&gt;</description>
      <pubDate>Sat, 18 Aug 2018 01:24:01 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-LLAP-problems-after-upgrade-to-HDP-3-0/m-p/215157#M82351</guid>
      <dc:creator>ewohlstadter</dc:creator>
      <dc:date>2018-08-18T01:24:01Z</dc:date>
    </item>
    <item>
      <title>Re: Spark + LLAP problems after upgrade to HDP 3.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-LLAP-problems-after-upgrade-to-HDP-3-0/m-p/215158#M82352</link>
      <description>&lt;P&gt;Setting &lt;EM&gt;spark.security.credentials.hiveserver2.enabled&lt;/EM&gt; to &lt;EM&gt;false&lt;/EM&gt; solved the problem. I can now use Spark with LLAP in both Java and Python. Only R is missing now; I will try to find out how to do it there as well. Thanks for the help!&lt;/P&gt;</description>
      <pubDate>Mon, 20 Aug 2018 16:03:39 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-LLAP-problems-after-upgrade-to-HDP-3-0/m-p/215158#M82352</guid>
      <dc:creator>berry_osterlund</dc:creator>
      <dc:date>2018-08-20T16:03:39Z</dc:date>
    </item>
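    <!-- The fix confirmed above amounts to two Spark properties. A minimal sketch of the resulting spark2-defaults entries, assuming an Ambari-managed cluster and a placeholder Kerberos realm (EXAMPLE.COM is not from the thread; substitute your own):

```properties
# Custom spark2-defaults (Ambari: Spark2 > Configs). Placeholder realm below.
# Stop Spark from fetching a HiveServer2 delegation token in client mode on a
# kerberized cluster, so HWC authenticates with the user's own Kerberos ticket.
spark.security.credentials.hiveserver2.enabled false
# Kerberos principal for the HiveServer2 Interactive JDBC URL used by HWC;
# hive/_HOST@EXAMPLE.COM is a placeholder, use your cluster's hive principal.
spark.sql.hive.hiveserver2.jdbc.url.principal hive/_HOST@EXAMPLE.COM
```

After adding these, restart the affected Spark2 services and rerun the spark-shell test from the original post. -->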
    <item>
      <title>Re: Spark + LLAP problems after upgrade to HDP 3.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-LLAP-problems-after-upgrade-to-HDP-3-0/m-p/215159#M82353</link>
      <description>&lt;P&gt;Where is the setting &lt;EM&gt;spark.security.credentials.hiveserver2.enabled&lt;/EM&gt; updated: in the Spark config in Ambari, or in Hive?&lt;/P&gt;</description>
      <pubDate>Sat, 15 Dec 2018 04:24:00 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-LLAP-problems-after-upgrade-to-HDP-3-0/m-p/215159#M82353</guid>
      <dc:creator>Chandra</dc:creator>
      <dc:date>2018-12-15T04:24:00Z</dc:date>
    </item>
    <item>
      <title>Re: Spark + LLAP problems after upgrade to HDP 3.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-LLAP-problems-after-upgrade-to-HDP-3-0/m-p/215160#M82354</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/11008/chandramoulimuthukumaran.html" nodeid="11008"&gt;@chandramouli muthukumaran&lt;/A&gt;&lt;/P&gt;&lt;P&gt;This is updated in the Spark2 config&lt;/P&gt;,&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/11008/chandramoulimuthukumaran.html" nodeid="11008"&gt;@chandramouli muthukumaran&lt;/A&gt;&lt;BR /&gt; &lt;/P&gt;&lt;P&gt;This is updated in Spark2 config.&lt;/P&gt;</description>
      <pubDate>Sat, 15 Dec 2018 04:32:28 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-LLAP-problems-after-upgrade-to-HDP-3-0/m-p/215160#M82354</guid>
      <dc:creator>ewohlstadter</dc:creator>
      <dc:date>2018-12-15T04:32:28Z</dc:date>
    </item>
    <item>
      <title>Re: Spark + LLAP problems after upgrade to HDP 3.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-LLAP-problems-after-upgrade-to-HDP-3-0/m-p/215161#M82355</link>
      <description>&lt;P&gt;Thanks &lt;A href="https://community.hortonworks.com/users/48334/ewohlstadter.html"&gt;Eric Wohlstadter&lt;/A&gt;. Do we add this as a custom spark2-defaults property in the config?&lt;/P&gt;</description>
      <pubDate>Sat, 15 Dec 2018 04:39:16 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-LLAP-problems-after-upgrade-to-HDP-3-0/m-p/215161#M82355</guid>
      <dc:creator>Chandra</dc:creator>
      <dc:date>2018-12-15T04:39:16Z</dc:date>
    </item>
    <item>
      <title>Re: Spark + LLAP problems after upgrade to HDP 3.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-LLAP-problems-after-upgrade-to-HDP-3-0/m-p/215162#M82356</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/11008/chandramoulimuthukumaran.html" nodeid="11008"&gt;@chandramouli muthukumaran&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Yes, that's right.&lt;/P&gt;</description>
      <pubDate>Sun, 16 Dec 2018 02:37:24 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-LLAP-problems-after-upgrade-to-HDP-3-0/m-p/215162#M82356</guid>
      <dc:creator>ewohlstadter</dc:creator>
      <dc:date>2018-12-16T02:37:24Z</dc:date>
    </item>
  </channel>
</rss>

