
Launch Spark job from Oozie shell-action in kerberized cluster w/o keytab

Expert Contributor

Hello community,

I have a cluster secured with a one-way trust relationship with an AD. Before enabling security in the cluster I was able to execute Spark via Oozie using a shell-action.

Is there a way to keep doing this without having to propagate my keytab to every NodeManager? I've seen that for Hive you can pass the HADOOP_TOKEN_FILE_LOCATION variable from a shell-action to use it; can I do something similar with Spark? If not, what alternatives do I have?

The problem with the keytab is that I have to change the password every month, so I would have to copy the keytab every time it changes...

Thank you in advance.

1 ACCEPTED SOLUTION

Contributor

You can upload your keytab file to the workflow's lib folder so that the keytab is copied to the container's working directory no matter which NodeManager the job runs on.

Then you can specify --keytab your-keytab --principal your-principal in your spark-submit command.

But you have to upload the updated keytab to the workflow lib folder every time you change the password.
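For illustration, here is a minimal sketch of the script the shell-action could run, assuming hypothetical names for the keytab (myuser.keytab), principal (myuser@EXAMPLE.COM), application class, and jar; adjust all of these to your own workflow:

    #!/bin/bash
    # The workflow lib folder is localized into the container's working
    # directory, so the keytab can be referenced by its bare file name.
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --keytab myuser.keytab \
      --principal myuser@EXAMPLE.COM \
      --class com.example.MySparkApp \
      my-spark-app.jar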


2 REPLIES

Expert Contributor

@Juan Manuel Nieto Yes, if it's a kerberized environment you need to provide the keytab to authenticate. Since you are using a shell-action you can use kinit too.
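For example, a minimal sketch of a shell-action script that authenticates with kinit before launching Spark, assuming hypothetical keytab, principal, class, and jar names:

    #!/bin/bash
    # Obtain a Kerberos ticket from a keytab shipped with the workflow
    # (file and principal names are placeholders).
    kinit -kt myuser.keytab myuser@EXAMPLE.COM
    spark-submit --master yarn --deploy-mode cluster \
      --class com.example.MySparkApp my-spark-app.jar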
