Support Questions


Spark-Hive Application: SASL Negotiation Failure with Kerberos on a Cluster

Explorer
I'm having an issue with a Spark-Hive application running on a Kerberized cluster. I receive a javax.security.sasl.SaslException: GSS initiate failed error, which appears to be caused by a failure to find any Kerberos TGT.
 
Here's the error log:
 
    23/08/04 22:56:55 INFO HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
    23/08/04 22:56:55 INFO HiveClientImpl: Attempting to login to Kerberos using principal: hdfs01@HDP.COM and keytab: hdfs01.keytab-2ca1f730-bef7-4166-90ce-67317c75c793
    23/08/04 22:56:55 INFO UserGroupInformation: Login successful for user hdfs01@HDP.COM using keytab file hdfs01.keytab-2ca1f730-bef7-4166-90ce-67317c75c793
    23/08/04 22:56:55 INFO metastore: Trying to connect to metastore with URI thrift://master3.abc.xyz.com:9083
    23/08/04 22:56:55 ERROR TSaslTransport: SASL negotiation failure
    javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
        at org.apac...
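
For reference, the "Failed to find any Kerberos tgt" message usually means the process had no usable ticket when SASL negotiation started. One way to sanity-check the keytab on the submitting host is with the standard MIT Kerberos tools (the principal and keytab path below are taken from the log above; adjust for your environment):

```shell
# List the principals stored in the keytab (read-only check)
klist -kt /etc/security/keytabs/hdfs01.keytab

# Try to obtain a TGT from the keytab as the job's principal
kinit -kt /etc/security/keytabs/hdfs01.keytab hdfs01@HDP.COM

# Confirm a valid TGT is now present in the ticket cache
klist
```

If kinit fails here, the problem is with the keytab, principal, or KDC reachability rather than with the Spark configuration.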
 
I am submitting my Spark job as follows:
 
    spark-submit \
    --name TestKerberous \
    --num-executors 2 \
    --driver-java-options "-Djava.security.auth.login.config=./key_fin.conf" \
    --driver-java-options "-Dsun.security.krb5.debug=true" \
    --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=./key_fin.conf"\
    --files=/etc/spark/conf/hive-site.xml,/etc/hadoop/conf/yarn-site.xml,/etc/hadoop/conf/hdfs-site.xml,/etc/hadoop/conf/core-site.xml \
    --conf "spark.hadoop.hive.metastore.kerberos.principal=HTTP/_HOST@HDP.COM" \
    --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=./key.conf" \
    --conf -Djavax.security.auth.useSubjectCredsOnly=false \
    --conf spark.executorEnv.KRB5_CONFIG=/etc/krb5.conf \
    --conf spark.driverEnv.KRB5_CONFIG=/etc/krb5.conf \
    --conf "spark.hadoop.hive.metastore.sasl.enabled=true" \
    --conf "spark.hadoop.hive.security.authorization.enabled=true" \
    --conf "spark.hadoop.hive.metastore.execute.setugi=true" \
    --conf spark.sql.hive.convertMetastoreParquet=false \
    --conf spark.home=/usr/hdp/current/spark2-client \
    --conf spark.sql.warehouse.dir=/apps/hive/warehouse \
    --conf spark.sql.catalogImplementation=hive \
    --conf spark.yarn.keytab=/etc/security/keytabs/hdfs01.keytab \
    --conf spark.yarn.principal=hdfs01@HDP.COM \
    --conf spark.serializer=org.apache.spark.serializer.KryoSerializer  \
    --master yarn --deploy-mode cluster --driver-cores 2 --driver-memory 2G --executor-cores 2 --executor-memory 2G --supervise \
    --class <CLASS_NAME> \
    <JAR_FILE>\
    "<Hive Jdbc Url>" "thrift://master3.abc.xyz.com:9083" "/apps/hive/warehouse"
 
 
I would really appreciate it if anyone could help me diagnose what might be going wrong and how to resolve this issue.

Thank you in advance for any insights you can provide.
2 ACCEPTED SOLUTIONS

Master Collaborator

Hi @Rohan44

Could you please test the above application by specifying only the keytab and principal, and removing the other security-related parameters from spark-submit?
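
A stripped-down submission along those lines might look like the following sketch. Note that --keytab and --principal are the spark-submit equivalents of spark.yarn.keytab and spark.yarn.principal; the paths and placeholders are copied from the post above, so adjust them for your environment:

```shell
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --name TestKerberous \
  --num-executors 2 --executor-cores 2 --executor-memory 2G \
  --driver-cores 2 --driver-memory 2G \
  --keytab /etc/security/keytabs/hdfs01.keytab \
  --principal hdfs01@HDP.COM \
  --class <CLASS_NAME> \
  <JAR_FILE> \
  "<Hive Jdbc Url>" "thrift://master3.abc.xyz.com:9083" "/apps/hive/warehouse"
```

If this works, you can add the remaining configuration back one option at a time to find the one that breaks Kerberos login.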


Explorer

@RangaReddy Thanks, it was a server-level issue. I tried a different edge node and it worked.

