<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: CDP - spark-shell --master yarn : security.HBaseDelegationTokenProvider: Fail to invoke HBaseConfiguration in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/CDP-spark-shell-master-yarn-security/m-p/349117#M235531</link>
    <description>&lt;P&gt;Hi &lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/53847"&gt;@paulo_klein&lt;/a&gt;, by default Apache Spark requests delegation tokens for four services: HDFS, YARN, Hive, and HBase. The failure is printed as a WARN message but is harmless: no HBase jars are on the Spark classpath, so Spark cannot obtain the HBase &lt;SPAN&gt;DelegationToken.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;To suppress the warning, start spark-shell, pyspark, or spark-submit with&lt;BR /&gt;--conf spark.security.credentials.hbase.enabled=false&lt;BR /&gt;&lt;BR /&gt;Example: # spark-shell --conf spark.security.credentials.hbase.enabled=false&lt;/P&gt;</description>
    <pubDate>Sat, 30 Jul 2022 01:51:31 GMT</pubDate>
    <dc:creator>jagadeesan</dc:creator>
    <dc:date>2022-07-30T01:51:31Z</dc:date>
    <item>
      <title>CDP - spark-shell --master yarn : security.HBaseDelegationTokenProvider: Fail to invoke HBaseConfiguration</title>
      <link>https://community.cloudera.com/t5/Support-Questions/CDP-spark-shell-master-yarn-security/m-p/349105#M235524</link>
      <description>&lt;P&gt;When I try to start a spark-shell with YARN I get this error:&lt;/P&gt;&lt;P&gt;22/07/29 15:36:08 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable&lt;BR /&gt;22/07/29 15:36:08 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.&lt;BR /&gt;&lt;FONT color="#FF6600"&gt;&lt;STRONG&gt;22/07/29 15:36:09 WARN security.HBaseDelegationTokenProvider: Fail to invoke HBaseConfiguration&lt;/STRONG&gt;&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT color="#FF6600"&gt;&lt;STRONG&gt;java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration&lt;/STRONG&gt;&lt;/FONT&gt;&lt;BR /&gt;at scala.reflect.internal.util.AbstractFileClassLoader.findClass(AbstractFileClassLoader.scala:62)&lt;BR /&gt;at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:589)&lt;BR /&gt;at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)&lt;BR /&gt;at org.apache.spark.deploy.security.HBaseDelegationTokenProvider.hbaseConf(HBaseDelegationTokenProvider.scala:117)&lt;BR /&gt;at org.apache.spark.deploy.security.HBaseDelegationTokenProvider.delegationTokensRequired(HBaseDelegationTokenProvider.scala:110)&lt;BR /&gt;at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$6.apply(HadoopDelegationTokenManager.scala:165)&lt;BR /&gt;at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$6.apply(HadoopDelegationTokenManager.scala:164)&lt;/P&gt;&lt;P&gt;How can I fix this issue?&lt;/P&gt;</description>
      <pubDate>Fri, 29 Jul 2022 18:39:51 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/CDP-spark-shell-master-yarn-security/m-p/349105#M235524</guid>
      <dc:creator>paulo_klein</dc:creator>
      <dc:date>2022-07-29T18:39:51Z</dc:date>
    </item>
    <item>
      <title>Re: CDP - spark-shell --master yarn : security.HBaseDelegationTokenProvider: Fail to invoke HBaseConfiguration</title>
      <link>https://community.cloudera.com/t5/Support-Questions/CDP-spark-shell-master-yarn-security/m-p/349117#M235531</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/53847"&gt;@paulo_klein&lt;/a&gt;, by default Apache Spark requests delegation tokens for four services: HDFS, YARN, Hive, and HBase. The failure is printed as a WARN message but is harmless: no HBase jars are on the Spark classpath, so Spark cannot obtain the HBase &lt;SPAN&gt;DelegationToken.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;To suppress the warning, start spark-shell, pyspark, or spark-submit with&lt;BR /&gt;--conf spark.security.credentials.hbase.enabled=false&lt;BR /&gt;&lt;BR /&gt;Example: # spark-shell --conf spark.security.credentials.hbase.enabled=false&lt;/P&gt;</description>
      <pubDate>Sat, 30 Jul 2022 01:51:31 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/CDP-spark-shell-master-yarn-security/m-p/349117#M235531</guid>
      <dc:creator>jagadeesan</dc:creator>
      <dc:date>2022-07-30T01:51:31Z</dc:date>
    </item>
  </channel>
</rss>