Hi, I want to call Spark from an Oozie workflow, and also delete some files from HDFS and write new files with the same names under the same directory. I put a script on HDFS that executes these commands, but I am getting some errors. I am doing this from the Hue UI.

The first error: when I delete the files and rewrite them, their permissions change, so I can't delete them again (Permission denied). I worked around this by unchecking the "Check HDFS Permissions" option for the HDFS service in Cloudera Manager.

The second error, which I can't figure out, is that when the script calls Spark I get "[Errno 13] Permission denied". The command the script executes is "spark-submit bin/cp_frcast_revenue.py". The file cp_frcast_revenue.py has ugo+rwx permissions. Can you please help me?
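For reference, my script looks roughly like this (the HDFS output path is a placeholder; the spark-submit line is the actual command from the script):

```shell
#!/usr/bin/env bash
# Rough sketch of the script I put on HDFS and run from the Oozie shell action.
set -euo pipefail

refresh_files() {
  # Delete the existing output files from HDFS (placeholder path).
  hdfs dfs -rm -f /path/to/output/*

  # The Spark job then rewrites files with the same names in the same directory.
  # This is the line that fails with "[Errno 13] Permission denied":
  spark-submit bin/cp_frcast_revenue.py
}

# refresh_files is invoked when the Oozie shell action runs this script.
```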