Created 07-29-2022 11:39 AM
When I try to start spark-shell on YARN, I get this error:
22/07/29 15:36:08 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/07/29 15:36:08 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
22/07/29 15:36:09 WARN security.HBaseDelegationTokenProvider: Fail to invoke HBaseConfiguration
java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
at scala.reflect.internal.util.AbstractFileClassLoader.findClass(AbstractFileClassLoader.scala:62)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:589)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
at org.apache.spark.deploy.security.HBaseDelegationTokenProvider.hbaseConf(HBaseDelegationTokenProvider.scala:117)
at org.apache.spark.deploy.security.HBaseDelegationTokenProvider.delegationTokensRequired(HBaseDelegationTokenProvider.scala:110)
at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$6.apply(HadoopDelegationTokenManager.scala:165)
at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$6.apply(HadoopDelegationTokenManager.scala:164)
How can I fix this issue?
Created 07-29-2022 06:51 PM
Hi @paulo_klein, by default Apache Spark requests a delegation token for four services: HDFS, YARN, Hive, and HBase. The message is logged as a WARN and does no harm; it appears because no HBase jars are on the Spark classpath, so Spark is unable to obtain the HBase delegation token.
To suppress the warning, start spark-shell, pyspark, or spark-submit with:
--conf spark.security.credentials.hbase.enabled=false
Example: # spark-shell --conf spark.security.credentials.hbase.enabled=false
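If you don't use HBase at all and want this to apply to every Spark session instead of passing the flag each time, you can also set the property in spark-defaults.conf. A minimal sketch, assuming the usual default config location (the path varies by distribution, so adjust it for your environment):

# Add to /etc/spark/conf/spark-defaults.conf (path is an assumption; may differ on your cluster)
spark.security.credentials.hbase.enabled false

After that, spark-shell, pyspark, and spark-submit all pick up the setting automatically and the HBaseDelegationTokenProvider warning should no longer appear.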