I am submitting a shell script that runs multiple Hadoop jobs one after another.
I run kinit,
submit the script, and then log off from the server.
The problem is:
only the first job executes successfully; the rest of the jobs fail with a "Kerberos ticket not found" error.
I understand that once I log out of the server, the Kerberos credentials are lost.
The first job succeeds because the NameNode stores the credential as an HDFS delegation token.
How can I use the credentials created before logout to run jobs that start after I have logged out?
The best option is to leverage a tool like Oozie, which handles ticket renewal for you.
Alternatively, you can try running your script with the nohup command, so that logging out does not stop the script from running.
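One caveat with nohup: it only keeps the process alive past logout; the Kerberos ticket still expires at the end of its normal lifetime. A common workaround is to authenticate from a keytab inside the script itself, so that each job starts with a valid ticket. A minimal sketch, assuming you have a keytab available (the keytab path, principal, and job jar names below are hypothetical placeholders):

```shell
#!/bin/bash
# run_jobs.sh - chain Hadoop jobs, re-authenticating from a keytab
# so the ticket stays valid even after the submitting user logs out.

KEYTAB=/home/myuser/myuser.keytab     # hypothetical keytab path
PRINCIPAL=myuser@EXAMPLE.COM          # hypothetical principal

# Acquire a fresh ticket before the first job
kinit -kt "$KEYTAB" "$PRINCIPAL"
hadoop jar job1.jar com.example.Job1

# Re-acquire (or renew) before each subsequent job in case the
# previous ticket expired while the earlier job was running
kinit -kt "$KEYTAB" "$PRINCIPAL"
hadoop jar job2.jar com.example.Job2
```

Submit it detached so logout does not kill it, e.g. `nohup ./run_jobs.sh > run_jobs.log 2>&1 &`. Note that keytab files grant passwordless authentication, so they must be readable only by the owning user.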
Hi Vijay and Sagar,
Thanks for your inputs.
Could you please point me to the Oozie component that handles the ticket renewal?