
Spark Structured Streaming job fails to authenticate against multiple Kerberos servers when submitted with spark-submit in client deploy mode

New Contributor

We are trying to run a Spark Structured Streaming job that reads from Kafka and writes to HDFS. Spark on YARN authenticates against one Kerberos server, while Kafka runs in another cluster secured by a different Kerberos server.

In cluster mode the same spark-submit works fine; in client mode, however, it fails with the error below.

Command:

./spark-submit --verbose \
  --num-executors 4 \
  --master yarn \
  --deploy-mode client \
  --driver-cores 2 \
  --executor-cores 2 \
  --executor-memory 1g \
  --driver-memory 1g \
  --conf "" \
  --files "/root/work/kf_client_jaas1.conf,/root/work/demo-spark/spark-hdp261/direp_lab.jks" \
  --conf "spark.hadoop.yarn.client.failover-proxy-provider=org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider" \
  --class org.xxxxx.spark.streaming.transform.KrbDebug \
  "/root/work/spark-2.4.0-bin-hadoop2.7/jars/sparkstreaming-xxxxx-2.4.0_poc.jar"
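One thing worth noting: in client deploy mode the driver JVM runs on the submitting machine, so a JAAS file shipped with --files is only localized into the executors' working directories; the driver has to be pointed at its local copy explicitly. Below is a minimal sketch of how both JVMs could be pointed at the JAAS file. The jar name and the idea that kf_client_jaas1.conf is the Kafka JAAS file are assumptions on my part, not facts from the post:

```shell
# Sketch only -- jar name and JAAS usage are assumptions, not from the post.
# Driver runs locally in client mode, so give it the absolute local path;
# executors receive the --files copy in their own working directory.
./spark-submit \
  --master yarn --deploy-mode client \
  --files "/root/work/kf_client_jaas1.conf" \
  --driver-java-options "-Djava.security.auth.login.config=/root/work/kf_client_jaas1.conf" \
  --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=./kf_client_jaas1.conf" \
  --class org.xxxxx.spark.streaming.transform.KrbDebug \
  app.jar
```

In cluster mode the driver itself runs in a YARN container where the --files copy is localized, which would explain why the same submit can behave differently between the two modes.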


Caused by: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. not available to garner authentication information from the user
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(
    at java.lang.reflect.Method.invoke(
    (remaining stack frames are garbled in the original post)
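This message is what the Kerberos login module emits when it has neither a usable ticket cache nor a keytab and falls back to prompting for a password, which the Kafka client cannot do. A sketch of the kind of KafkaClient JAAS entry that avoids the prompt by authenticating from a keytab is shown below; the keytab path and principal are made-up placeholders, not values from the post:

```
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/path/to/kafka_client.keytab"
  principal="client@KAFKA-REALM.EXAMPLE";
};
```

With a keytab-based entry like this, the Kafka client can log in to its KDC independently of the kinit ticket used for the YARN side.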

I am doing a kinit before the spark-submit for the Spark on YARN authentication. Can someone help me understand the issue with this execution?


Master Guru

@Sreenath I think this is because the secure connection is not available yet.
