I am trying to use a Java servlet filter to protect access to the Spark UI by setting the spark.ui.filters property. The problem is that when Spark runs on YARN, that property is always overridden with the filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter:
spark.ui.filters: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
And these properties are added automatically:
spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_HOSTS: ip-x-x-x-226.eu-west-1.compute.internal
spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_URI_BASES: http://ip-x-x-x-226.eu-west-1.compute.internal:20888/proxy/application_xxxxxxxxxxxxx_xxxx
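For reference, this is roughly how I am registering the custom filter; the filter class name, parameter, and jar paths below are placeholders, not my actual ones. (Spark picks up per-filter init parameters from config entries of the form spark.&lt;filter class&gt;.param.&lt;name&gt;.)

```shell
# Sketch with placeholder names: com.example.filters.BasicAuthFilter
# and its jar are hypothetical, standing in for my real filter.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --jars /path/to/my-filter.jar \
  --conf spark.ui.filters=com.example.filters.BasicAuthFilter \
  --conf spark.com.example.filters.BasicAuthFilter.param.realm=spark-ui \
  --class com.example.MyApp \
  /path/to/my-app.jar
```

On YARN, however, the spark.ui.filters value above ends up replaced by the AmIpFilter entry shown earlier.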
Any suggestions on how to add a Java security filter so that it does not get overridden, or alternatively on how to configure security on the Hadoop side?
Environment:
AWS EMR, running in YARN cluster mode.
Thanks.