First, the SparkR backend handler shouldn't be so fragile that it fails simply because I got stuck in a one-hour meeting in the middle of constructing a script. SparkR seems to be the neglected stepchild of the Spark API family, and it definitely shows... so someone at Hortonworks should either kill it off forever and force me to switch to Python, or put your money where your software is and make this puppy more robust.
But even if Hortonworks ignores R and SparkR, I still have to kill the hanging YARN application. Thank the lord I have 'sudo' privileges... which used to work! Until HDP 2.6.5! Now when I run the command to kill the application ID, I get the following error:
Killing application application_1536084816396_0023
Exception in thread "main" org.apache.hadoop.yarn.exceptions.YarnException: java.security.AccessControlException: User hdfs cannot perform operation MODIFY_APP on application_1536084816396_0023
Can anyone tell me why my sudo call is now being interpreted as the 'hdfs' user?
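For reference, this is roughly the sequence I'd expect to work, a sketch assuming the application was submitted by some other user and that YARN's ACLs only let the owner (or a configured admin) issue MODIFY_APP. The owner name 'sparkuser' below is a placeholder, not my actual setup; the real owner shows up in the -list output:

```shell
# List running applications to find who actually owns the hanging one
yarn application -list -appStates RUNNING

# Kill it as the submitting user rather than as hdfs, since the
# AccessControlException says hdfs lacks MODIFY_APP on this application.
# 'sparkuser' is a placeholder for the owner shown in the listing above.
sudo -u sparkuser yarn application -kill application_1536084816396_0023
```

My understanding is that `sudo yarn ...` on its own runs the command as root, and YARN then maps that to whatever user the shell environment presents, so being explicit with `sudo -u <owner>` sidesteps the question of how root is being resolved.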