
Launch spark job from oozie shell-action in kerberized cluster w/o keytab

Solved


Expert Contributor

Hello community,

I have a cluster secured with a one-way trust relationship with an AD. Before enabling security on the cluster, I was able to execute Spark via Oozie using a shell-action.

Is there a way to keep doing this without having to propagate my keytab to every NodeManager? I've seen that for Hive you can pass the HADOOP_TOKEN_FILE_LOCATION variable from a shell-action so it uses the delegation token. Can I do something similar with Spark? If not, what alternatives do I have?
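For context, the Hive trick mentioned above works because the Oozie launcher sets HADOOP_TOKEN_FILE_LOCATION in the container environment. A minimal sketch of what that looks like inside a shell-action script (the beeline URL and query file are hypothetical placeholders):

```shell
#!/bin/bash
# Oozie's launcher sets HADOOP_TOKEN_FILE_LOCATION to the delegation-token
# file inside the YARN container; Hadoop clients pick it up automatically
# when the variable is exported.
TOKEN_FILE="${HADOOP_TOKEN_FILE_LOCATION:-unset}"
echo "delegation token file: $TOKEN_FILE"

# With the variable in place, a Hive call in the same script authenticates
# with the delegation token instead of a keytab, e.g. (hypothetical values):
#   beeline -u "$JDBC_URL" -f query.hql
```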

The problem with the keytab is that I have to change the password every month, so I would have to copy the keytab every time it changes...

Thank you in advance.

1 ACCEPTED SOLUTION


Re: Launch spark job from oozie shell-action in kerberized cluster w/o keytab

Explorer

You can upload your keytab file to the workflow lib folder so that the keytab is copied into the container's working directory no matter which NodeManager the job runs on.

Then you can specify --keytab your-keytab --principal your-principal in your spark-submit command.

But you do have to upload the updated keytab to the workflow lib folder every time you change the password.
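As a sketch, the shell action's script would then build a spark-submit call like this (the keytab name, principal, application class, and jar below are placeholders, not values from the thread):

```shell
#!/bin/bash
# Placeholders: replace with your keytab file name (as uploaded to the
# workflow lib folder, so it is localized into the container's working
# directory), your principal, and your application class/jar.
KEYTAB="my.keytab"
PRINCIPAL="myuser@EXAMPLE.COM"

CMD="spark-submit --master yarn --deploy-mode cluster \
  --keytab $KEYTAB --principal $PRINCIPAL \
  --class com.example.MyApp my-app.jar"

# Print the command; in the real workflow you would execute it directly.
echo "$CMD"
```

With --keytab and --principal, Spark itself manages ticket renewal for long-running jobs, which is why this is preferred over a plain kinit for anything long-lived.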


2 REPLIES

Re: Launch spark job from oozie shell-action in kerberized cluster w/o keytab

Contributor

@Juan Manuel Nieto Yes, if it's a kerberized environment you need to provide a keytab to authenticate. Since you are using a shell-action, you can use kinit too.
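For example, a sketch of a shell-action script that obtains a ticket with kinit before submitting (the keytab name and principal are placeholders; the kinit call is guarded so the sketch also runs on machines without a Kerberos client or keytab):

```shell
#!/bin/bash
# Placeholders: adjust the keytab file name and principal for your cluster.
KEYTAB="my.keytab"
PRINCIPAL="myuser@EXAMPLE.COM"

# Use a private credential cache so concurrent containers on the same
# NodeManager don't overwrite each other's tickets.
export KRB5CCNAME="/tmp/krb5cc_$(id -u)_$$"

if command -v kinit >/dev/null 2>&1 && [ -f "$KEYTAB" ]; then
  # Obtain a TGT from the keytab, then submit the job with that identity.
  kinit -kt "$KEYTAB" "$PRINCIPAL"
  spark-submit --master yarn my-app.jar
else
  echo "kinit or keytab not found; run this inside the cluster's shell action"
fi
```

Note that a plain kinit ticket expires after the ticket lifetime, so for long-running jobs the --keytab/--principal flags on spark-submit are the safer option.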

