I have a simple PySpark application that queries a Hive table using HiveContext and dumps the result to a CSV file on HDFS. The application runs fine on my (kerberized) cluster when I submit it with spark-submit. I have a valid Kerberos ticket, so there isn't any issue submitting the job directly.
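For context, the job is essentially just the following (table name and output path are placeholders; I'm assuming a Spark 1.x-style HiveContext here, where the CSV writer comes from the spark-csv package — on Spark 2.x, `format("csv")` is built in):

```python
from pyspark import SparkConf, SparkContext
from pyspark.sql import HiveContext

# Minimal sketch of the job; table name and output path are illustrative.
conf = SparkConf().setAppName("HiveToCsv")
sc = SparkContext(conf=conf)
hc = HiveContext(sc)

# Query the Hive table.
df = hc.sql("SELECT * FROM mydb.mytable")

# Dump the result as CSV on HDFS (spark-csv package on 1.x;
# replace with format("csv") on Spark 2.x).
df.write.format("com.databricks.spark.csv") \
    .option("header", "true") \
    .save("hdfs:///tmp/mytable_csv")

sc.stop()
```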
When I run the same job via the Ambari Workflow Manager, I get a `GSSException: No valid credentials provided`. How do I pass on the user credentials to the workflow?
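For reference, the Spark action in the generated workflow looks roughly like this (a sketch, with names and paths changed; this is the Oozie spark-action schema, with no credentials configured on the action):

```xml
<workflow-app xmlns="uri:oozie:workflow:0.5" name="spark-hive-to-csv">
  <start to="spark-node"/>
  <action name="spark-node">
    <spark xmlns="uri:oozie:spark-action:0.1">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <master>yarn-cluster</master>
      <name>HiveToCsv</name>
      <jar>${nameNode}/apps/hive_to_csv.py</jar>
    </spark>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Spark action failed</message>
  </kill>
  <end name="end"/>
</workflow-app>
```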