10-28-2017 06:44 PM
When I do the above, my job fails with a permission error: "Permission denied: user=myuser, access=EXECUTE, inode="/jobs/logs/dr.who/logs":dr.who:hadoop:drwxrwx---". My cluster is not secured, but it has dfs.permissions = true and I would prefer not to disable that. Every time the job is submitted through the API it runs as user dr.who, and that user does not actually exist. When I run jobs with spark-submit or hadoop jar, they run as the logged-in user. I understand that "dr.who" is the HTTP static user, but how do I submit the job as some other valid user supplied in the API? The YARN API documentation is ambiguous as hell; what does this line even mean to a non-ops person: "Please note that in order to submit an app, you must have an authentication filter setup for the HTTP interface. The functionality requires that a username is set in the HttpServletRequest. If no filter is setup, the response will be an "UNAUTHORIZED" response." Does this mean the REST API can only be used by a Java or JVM-based client? (HttpServletRequest is a Java class.) So please help me: how can one get around, or fix, the application running as "dr.who"?