Created 01-10-2018 10:40 PM
I am trying to use a Java filter to protect access to the Spark UI by using the property spark.ui.filters; the problem is that when Spark is running in YARN mode, that property is always overridden with the filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter:
spark.ui.filters: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
And these properties are added automatically:
spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_HOSTS: ip-x-x-x-226.eu-west-1.compute.internal
spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_URI_BASES: http://ip-x-x-x-226.eu-west-1.compute.internal:20888/proxy/application_xxxxxxxxxxxxx_xxxx
Any suggestion on how to add a Java security filter so that it does not get overridden, or alternatively how to configure security from the Hadoop side?
Environment:
AWS EMR, YARN cluster mode.
Thanks.
Created 03-05-2018 02:12 PM
This is solved by using the property hadoop.http.authentication.type to specify a custom Java handler class that contains the authentication logic. The class only has to implement the interface org.apache.hadoop.security.authentication.server.AuthenticationHandler. See:
https://hadoop.apache.org/docs/r2.7.3/hadoop-project-dist/hadoop-common/HttpAuthentication.html
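For reference, wiring a custom handler into core-site.xml looks roughly like this. This is a sketch: com.example.MyAuthenticationHandler is a hypothetical class name standing in for your own implementation of the AuthenticationHandler interface (getType, init, destroy, managementOperation, authenticate); the other two properties are standard ones documented on the page linked above.

```xml
<!-- core-site.xml (sketch) -->
<!-- com.example.MyAuthenticationHandler is a hypothetical class implementing
     org.apache.hadoop.security.authentication.server.AuthenticationHandler -->
<property>
  <name>hadoop.http.authentication.type</name>
  <value>com.example.MyAuthenticationHandler</value>
</property>
<!-- How long (in seconds) an authentication token is valid before renewal -->
<property>
  <name>hadoop.http.authentication.token.validity</name>
  <value>36000</value>
</property>
<!-- File containing the secret used to sign authentication tokens -->
<property>
  <name>hadoop.http.authentication.signature.secret.file</name>
  <value>/etc/hadoop/http-auth-signature-secret</value>
</property>
```

The jar containing the handler class must be on the classpath of the daemons serving the web UIs for the class to be loaded.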