We use Jupyterhub with our HDP cluster.
When starting a PySpark kernel, we have the problem that the link to the Spark UI does not work. Following the link returns an HTTP ERROR 500 with the following detail:
javax.servlet.ServletException: Could not determine the proxy server for redirection at org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.findRedirectUrl(AmIpFilter.java:205)
The Spark application itself is running, and the Spark UI can also be accessed by following the link in the ResourceManager UI.
We currently suspect that this problem is related to an issue in YARN in a secure HA setup:
This issue seems to be fixed in YARN 2.9.0, but the current version in HDP is 2.7.3.
My question is: are you aware of this issue, and do you know of any workarounds?
We are already on HDP 2.6.3 (upgraded from 2.6.2 several weeks ago) but are still facing this issue. Could it be related to something else?
In the YARN config in Ambari, I see that config params related to the non-HA setup are still set, like
HA-related configs are also present, like
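For illustration, the kind of entries I mean look roughly like this (the property names follow the standard Hadoop YARN ResourceManager HA documentation; the hostnames are placeholders, not our actual values):

```xml
<!-- Hypothetical illustration only; hostnames are placeholders. -->

<!-- Non-HA style properties that are still set -->
<property>
  <name>yarn.resourcemanager.hostname</name>
  <value>rm-host.example.com</value>
</property>
<property>
  <name>yarn.resourcemanager.webapp.address</name>
  <value>rm-host.example.com:8088</value>
</property>

<!-- HA-related properties that are also present -->
<property>
  <name>yarn.resourcemanager.ha.enabled</name>
  <value>true</value>
</property>
<property>
  <name>yarn.resourcemanager.ha.rm-ids</name>
  <value>rm1,rm2</value>
</property>
<property>
  <name>yarn.resourcemanager.hostname.rm1</name>
  <value>rm1-host.example.com</value>
</property>
<property>
  <name>yarn.resourcemanager.hostname.rm2</name>
  <value>rm2-host.example.com</value>
</property>
```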
But Ambari does not allow me to remove the old non-HA settings; it says the field is required. Is that normal behavior? The YARN HA documentation says that the old settings must be replaced with the HA-specific ones.