Created 07-17-2018 03:51 PM
Hi Team,
I have enabled Kerberos on my cluster by following the steps below:
1) Installed Kerberos on one node that is accessible from all cluster nodes.
2) Enabled Kerberos from Ambari by providing the details of the previously created Kerberos server.
Everything went fine, but when I try to run spark-submit (a word-count Spark job) from any of the data nodes, I get the error below:
"Invalid credentials, no valid tgt found."
I tried with all the users, hdfs, spark, and yarn (sudo -u hdfs), but no luck.
However, when I run the same job from the master node, it executes successfully.
What should I do to be able to run Spark, Hive, or any other job from any data node or edge node?
Please advise.
Thanks in advance.
Created 07-17-2018 05:24 PM
You need to run the kinit command before running the job. For example, to do kinit with the spark keytab, follow the steps below:
1) Get the principal
klist -kt /etc/security/keytabs/spark.headless.keytab
2) Do kinit
kinit -kt /etc/security/keytabs/spark.headless.keytab <principal-from-1st-command>
.
Run the job after running the kinit command and it should no longer give "Invalid credentials, no valid tgt found.". Hope this helps 🙂
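The two steps above can also be scripted so you don't have to copy the principal by hand. This is just a sketch: it assumes typical MIT Kerberos `klist -kt` output (three header lines, then one entry per row with the principal in the fourth column), and the sample output and the EXAMPLE.COM realm below are placeholders, not values from this cluster.

```shell
# Sample `klist -kt` output for illustration only (placeholder principal/realm).
sample="Keytab name: FILE:/etc/security/keytabs/spark.headless.keytab
KVNO Timestamp         Principal
---- ----------------- --------------------------------------------------------
   1 07/17/18 10:00:00 spark-cluster@EXAMPLE.COM"

# Skip the three header lines and take the principal (4th column) of the first entry.
principal=$(printf '%s\n' "$sample" | awk 'NR>3 { print $4; exit }')
echo "$principal"

# In practice, against the real keytab, the same idea would look like:
#   principal=$(klist -kt /etc/security/keytabs/spark.headless.keytab | awk 'NR>3 { print $4; exit }')
#   kinit -kt /etc/security/keytabs/spark.headless.keytab "$principal"
#   klist   # confirm the TGT is now in the credential cache
```

Run this (or the kinit steps manually) on each data/edge node you submit from, since the ticket cache is per-host and per-user.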
Please "Accept" the answer if this helps.
.
-Aditya