
How to create a security filter for the Spark UI in Spark on YARN

New Contributor

I am trying to use a Java servlet filter to protect access to the Spark UI, by setting the property spark.ui.filters. The problem is that when Spark runs on YARN, that property is always overridden with the filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter:

spark.ui.filters: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter

And these properties are added automatically:

spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_HOSTS: ip-x-x-x-226.eu-west-1.compute.internal
spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_URI_BASES: http://ip-x-x-x-226.eu-west-1.compute.internal:20888/proxy/application_xxxxxxxxxxxxx_xxxx
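
For reference, the filter I am trying to register is a plain javax.servlet.Filter along these lines (just a minimal sketch; the class name, the X-UI-Secret header and the secret parameter are placeholders for the real check):

import java.io.IOException;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Minimal sketch of a servlet filter for the Spark UI; the header name and
// secret parameter are placeholders, not a real implementation.
public class SparkUiAuthFilter implements Filter {

  private String secret;

  @Override
  public void init(FilterConfig filterConfig) throws ServletException {
    // Filter parameters arrive via spark.<filter class>.param.<name> settings.
    secret = filterConfig.getInitParameter("secret");
  }

  @Override
  public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
      throws IOException, ServletException {
    HttpServletRequest req = (HttpServletRequest) request;
    HttpServletResponse resp = (HttpServletResponse) response;

    // Placeholder check: require a header carrying the expected secret.
    String provided = req.getHeader("X-UI-Secret");
    if (secret != null && secret.equals(provided)) {
      chain.doFilter(request, response);   // authenticated, continue to the UI
    } else {
      resp.sendError(HttpServletResponse.SC_UNAUTHORIZED, "Access to the Spark UI is restricted");
    }
  }

  @Override
  public void destroy() {
    // No resources to release in this sketch.
  }
}

which would be registered the same way the AmIpFilter entries above appear:

spark.ui.filters: com.example.SparkUiAuthFilter
spark.com.example.SparkUiAuthFilter.param.secret: <shared secret>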


Any suggestions on how to add a Java security filter so that it does not get overridden, or on how to configure this kind of security from the Hadoop side?

Environment:
AWS EMR, YARN cluster mode.


Thanks.

1 ACCEPTED SOLUTION

New Contributor

This is solved by using the property hadoop.http.authentication.type to specify a custom Java handler class that contains the authentication logic. The class only has to implement the interface org.apache.hadoop.security.authentication.server.AuthenticationHandler. See:

https://hadoop.apache.org/docs/r2.7.3/hadoop-project-dist/hadoop-common/HttpAuthentication.html
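
For example, a handler along these lines could be plugged in (a minimal sketch; the class name, the X-Custom-Auth header and the secret property are placeholders for whatever authentication logic is actually needed):

import java.io.IOException;
import java.util.Properties;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.hadoop.security.authentication.client.AuthenticationException;
import org.apache.hadoop.security.authentication.server.AuthenticationHandler;
import org.apache.hadoop.security.authentication.server.AuthenticationToken;

// Minimal sketch of a custom authentication handler for the Hadoop HTTP
// authentication filter; the header and secret check are placeholders.
public class CustomAuthHandler implements AuthenticationHandler {

  public static final String TYPE = "custom";

  private Properties config;

  @Override
  public String getType() {
    return TYPE;
  }

  @Override
  public void init(Properties config) throws ServletException {
    // Receives the hadoop.http.authentication.* properties (prefix stripped).
    this.config = config;
  }

  @Override
  public void destroy() {
    // Nothing to clean up in this sketch.
  }

  @Override
  public boolean managementOperation(AuthenticationToken token,
                                     HttpServletRequest request,
                                     HttpServletResponse response)
      throws IOException, AuthenticationException {
    // No special management operations; continue with normal processing.
    return true;
  }

  @Override
  public AuthenticationToken authenticate(HttpServletRequest request,
                                          HttpServletResponse response)
      throws IOException, AuthenticationException {
    // Placeholder check: accept requests that carry the expected header value.
    String provided = request.getHeader("X-Custom-Auth");
    if (provided != null && provided.equals(config.getProperty("secret"))) {
      return new AuthenticationToken("user", "user", TYPE);   // user name, principal, auth type
    }
    // Reject otherwise; the authentication filter turns this into a 401.
    response.sendError(HttpServletResponse.SC_UNAUTHORIZED, "Authentication required");
    return null;
  }
}

The handler is then enabled in core-site.xml by setting the property to the fully qualified class name, e.g.:

hadoop.http.authentication.type: com.example.CustomAuthHandler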

