Support Questions
Find answers, ask questions, and share your expertise

spark-submit failing to connect to metastore due to Kerberos: Caused by GSSException: No valid credentials provided — but works in local client mode

New Contributor

In Docker, the PySpark (2.3.0) shell in local client mode works and is able to connect to Hive. However, issuing spark-submit with all dependencies fails with the error below:

20/08/24 14:03:01 INFO storage.BlockManagerMasterEndpoint: Registering block manager with 6.2 GB RAM, BlockManagerId(3, , 41697, None)
20/08/24 14:03:02 INFO hive.HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
20/08/24 14:03:02 INFO hive.metastore: Trying to connect to metastore with URI thrift://
20/08/24 14:03:02 ERROR transport.TSaslTransport: SASL negotiation failure
GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(

Running a simple pi example via spark-submit in yarn-cluster mode through a PySpark script works fine with no Kerberos issues, but when trying to access the Hive metastore I get the Kerberos error.


Spark-submit command:

spark-submit --master yarn --deploy-mode cluster --files=/etc/hive/conf/hive-site.xml,/etc/hive/conf/yarn-site.xml,/etc/hive/conf/hdfs-site.xml,/etc/hive/conf/core-site.xml,/etc/hive/conf/mapred-site.xml,/etc/hive/conf/ssl-client.xml --name fetch_hive_test --executor-memory 12g --num-executors 20

The job is a simple PySpark script that shows the tables in a test database:

from pyspark.sql import SparkSession

appName = "test_hive_minimal"
master = "yarn"

sc = SparkSession.builder \
    .appName(appName) \
    .master(master) \
    .enableHiveSupport() \
    .config("spark.hadoop.hive.enforce.bucketing", "True") \
    .config("", "none") \
    .config("hive.exec.dynamic.partition", "True") \
    .config("hive.exec.dynamic.partition.mode", "nonstrict") \
    .getOrCreate()

sql = "show tables in user_tables"
df_new = sc.sql(sql)


Can anyone throw some light on how to fix this? The keytab is fine, because Hadoop can be accessed from the Docker terminal. Aren't Kerberos tickets managed automatically by YARN? I tried passing the keytab and principal, but it did not help. What seems to be the issue here?
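For reference, this is roughly how I tried passing the credentials explicitly (the principal, keytab path, and script name below are placeholders, not my real values):

```shell
# Sketch only: explicitly handing Kerberos credentials to spark-submit
# so the driver/AM can renew tickets itself on YARN.
# user@EXAMPLE.COM, /path/to/user.keytab and test_hive_minimal.py are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --principal user@EXAMPLE.COM \
  --keytab /path/to/user.keytab \
  --files /etc/hive/conf/hive-site.xml \
  --name fetch_hive_test \
  test_hive_minimal.py
```
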


This is CDH 5.13 with Spark 2.3.