
CDP - spark-shell --master yarn : security.HBaseDelegationTokenProvider: Fail to invoke HBaseConfiguration

Explorer

When I try to start spark-shell with YARN, I get this error:

22/07/29 15:36:08 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/07/29 15:36:08 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
22/07/29 15:36:09 WARN security.HBaseDelegationTokenProvider: Fail to invoke HBaseConfiguration
java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
    at scala.reflect.internal.util.AbstractFileClassLoader.findClass(AbstractFileClassLoader.scala:62)
    at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:589)
    at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
    at org.apache.spark.deploy.security.HBaseDelegationTokenProvider.hbaseConf(HBaseDelegationTokenProvider.scala:117)
    at org.apache.spark.deploy.security.HBaseDelegationTokenProvider.delegationTokensRequired(HBaseDelegationTokenProvider.scala:110)
    at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$6.apply(HadoopDelegationTokenManager.scala:165)
    at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$6.apply(HadoopDelegationTokenManager.scala:164)

How can I fix this issue?

1 ACCEPTED SOLUTION

Master Collaborator

Hi @paulo_klein, by default Apache Spark requests a delegation token for all four services: HDFS, YARN, Hive, and HBase. The failure is printed as a WARN message, but it does no harm: it appears because no HBase jars are on the Spark classpath, so Spark is unable to obtain an HBase delegation token.
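If your job actually needs to read from HBase on a Kerberized cluster, the alternative is to put the HBase client jars on the classpath so the token can be fetched. A minimal sketch, assuming the hbase CLI is installed on the node (hbase mapredcp prints the colon-separated HBase client classpath, which --jars expects comma-separated):

# Add the HBase client jars so Spark can obtain the HBase delegation token
spark-shell --master yarn --jars $(hbase mapredcp | tr ':' ',')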

Otherwise, to fix this issue, start spark-shell, pyspark, or spark-submit with
--conf spark.security.credentials.hbase.enabled=false

Example: # spark-shell --conf spark.security.credentials.hbase.enabled=false
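To make this permanent rather than per-session, the same property can be set in spark-defaults.conf. A sketch, assuming the usual CDP client configuration path (adjust for your deployment):

# /etc/spark/conf/spark-defaults.conf
# Skip the HBase delegation token lookup since no HBase jars are on the classpath
spark.security.credentials.hbase.enabled false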
