
Unable to submit spark jobs from Data Nodes after enabling kerberos

Contributor

Hi Team,

I have enabled Kerberos on my cluster by following the steps below.

1) Installed the Kerberos server (KDC) on one node that is reachable from all cluster nodes.

2) Enabled Kerberos from Ambari by providing the details of the previously created Kerberos server.

Everything went fine. But when I try to run spark-submit (a word count Spark job) from any of the data nodes, I get the error below.

"Invalid credentials, no valid tgt found."

I tried with all the service users (hdfs, spark, and yarn, e.g. sudo -u hdfs), but no luck.

But when I run the same job from the master node, it executes successfully.

What should I do to be able to run Spark/Hive/any other jobs from any data node or edge node?

Please advise.

Thanks in advance.

1 ACCEPTED SOLUTION

Super Guru

@Paramesh malla,

You need to run the kinit command before running the job. For example, to do kinit with the spark keytab, follow the steps below.

1) Get the principal

klist -kt /etc/security/keytabs/spark.headless.keytab

2) Do kinit

kinit -kt /etc/security/keytabs/spark.headless.keytab <principal-from-1st-command>
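For example, if step 1 lists a principal such as spark-mycluster@EXAMPLE.COM (the exact principal and realm on your cluster will differ), you would run:

kinit -kt /etc/security/keytabs/spark.headless.keytab spark-mycluster@EXAMPLE.COM

# confirm a valid ticket is now in the cache
klist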


Run the job after running the kinit command and it should no longer give "Invalid credentials, no valid tgt found."
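As a quick end-to-end check (the class and jar path below are only illustrative; substitute your own word count job), you can then submit from the data node:

spark-submit --master yarn --class org.apache.spark.examples.SparkPi \
  /usr/hdp/current/spark-client/lib/spark-examples.jar 10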

Please "Accept" the answer if this helps.


-Aditya
