SparkR Backend hangs, then 'sudo yarn application -kill ' ...now thinks I'm user 'hdfs'



First, the SparkR backend handler shouldn't be so fragile that it fails simply because I got stuck in a one-hour meeting while in the middle of writing a script. SparkR seems to be the neglected stepchild of the Spark API family, and it shows. Someone from Hortonworks should either kill it off forever and force me to switch to Python, or put your money where your software is and make this thing more robust.

But even if Hortonworks ignores R and SparkR, I still have to kill the hanging YARN application. Thank the lord I have sudo privileges, which used to work, until 2.6.5! Now when I run the command to kill the application ID, I get the following error:

Killing application application_1536084816396_0023
Exception in thread "main" org.apache.hadoop.yarn.exceptions.YarnException: java.security.AccessControlException: User hdfs cannot perform operation MODIFY_APP on application_1536084816396_0023

Can anyone tell me why my sudo call is now identifying me as the 'hdfs' user?
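For context on what may be going on (an assumption, not a confirmed diagnosis of this cluster): plain `sudo` runs the command as root, not as you, and on a non-Kerberized cluster the Hadoop/YARN client derives its identity from the effective OS user or from the `HADOOP_USER_NAME` environment variable, which some installs set to `hdfs` in root's environment. A minimal sketch of the identity switch:

```shell
# Assumption: non-Kerberized cluster, where the YARN client takes its
# identity from the effective OS user or from HADOOP_USER_NAME.

id -un                 # prints the user you are actually logged in as
# sudo id -un          # would print 'root': sudo switches the effective user

# On insecure clusters, HADOOP_USER_NAME overrides the user the Hadoop
# client reports to YARN; here we set it back to the login user:
HADOOP_USER_NAME=$(id -un); export HADOOP_USER_NAME
echo "$HADOOP_USER_NAME"

# Hypothetical kill commands that keep the submitting user's identity
# (left commented; they require a live cluster):
# sudo -u "$USER" yarn application -kill application_1536084816396_0023
# sudo HADOOP_USER_NAME="$USER" yarn application -kill application_1536084816396_0023
```

If the upgrade to 2.6.5 changed root's profile to export `HADOOP_USER_NAME=hdfs`, that would explain the error message naming `hdfs` rather than root or your own account.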
