
Workaround for YARN Issue 6625


We use Jupyterhub with our HDP cluster.
When starting a PySpark kernel, the link to the Spark UI does not work. Following the link returns an HTTP ERROR 500 with the following detail:

javax.servlet.ServletException: Could not determine the proxy server for redirection
at org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.findRedirectUrl(AmIpFilter.java:205)

The Spark application itself is running, and the Spark UI can also be accessed by following the link in the ResourceManager UI.

We currently suspect that this problem is related to a known YARN issue in a secure HA setup:
https://issues.apache.org/jira/browse/YARN-6625

This issue seems to be fixed in YARN 2.9.0, but the version currently shipped with HDP is 2.7.3.
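
For reference, below is a rough sketch of how we check which proxy-related settings each ResourceManager actually reports, since (as far as we understand) the AmIpFilter derives its list of proxy hosts from these values. It is only a diagnostic aid, not a fix: the hostnames and port 8088 are placeholders for our setup, plain HTTP access is assumed (a Kerberized RM web UI would additionally need SPNEGO authentication, e.g. via requests-kerberos), and the RM's /conf servlet is queried with format=xml, which is what Hadoop 2.7.x accepts.

# Diagnostic sketch (not a fix): dump the proxy-related settings that the
# AM proxy filter relies on, as reported by each ResourceManager's /conf servlet.
# Hostnames/port and plain HTTP are assumptions; a Kerberized web UI would
# additionally need SPNEGO auth (e.g. requests-kerberos).
import requests
import xml.etree.ElementTree as ET

RM_WEB_UIS = ["http://host1:8088", "http://host2:8088"]  # assumed RM web addresses
KEYS_OF_INTEREST = (
    "yarn.resourcemanager.ha.enabled",
    "yarn.resourcemanager.ha.rm-ids",
    "yarn.resourcemanager.webapp.address",
    "yarn.resourcemanager.hostname",
    "yarn.web-proxy.address",
)

for rm in RM_WEB_UIS:
    print(f"--- {rm} ---")
    # /conf returns the daemon's effective configuration; request XML explicitly.
    resp = requests.get(f"{rm}/conf", params={"format": "xml"}, timeout=10)
    resp.raise_for_status()
    for prop in ET.fromstring(resp.content).findall("property"):
        name = prop.findtext("name", default="")
        if name.startswith(KEYS_OF_INTEREST):  # prefix match also catches .rm1/.rm2 variants
            print(f"{name} = {prop.findtext('value')}")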

My question is: are you aware of this issue, and do you perhaps know of any workarounds?


@Alexander Schätzle

There is no known workaround for YARN-6625. The issue is fixed in HDP 2.6.3 (see the fixed-issues documentation). If you are using a lower version, you could consider upgrading to the latest version that includes the fix.


@Sindhu

We are already using HDP 2.6.3 (upgraded from 2.6.2 several weeks ago) but are still facing this issue. Could it also be related to something else?

In the YARN config in Ambari, I still see config params related to the non-HA setup, for example:

yarn.resourcemanager.hostname=host2

HA-related configs are also present, for example:

yarn.resourcemanager.hostname.rm1=host1
yarn.resourcemanager.hostname.rm2=host2

But Ambari does not allow me to remove the old non-HA settings; it says that the field is required. Is that normal behavior? The YARN documentation for HA says that the old settings must be replaced with the HA-related ones.
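
To cross-check what is actually deployed on the nodes (as opposed to what Ambari displays), we use a small sketch along these lines to list the non-HA and HA-related ResourceManager keys side by side. The path /etc/hadoop/conf/yarn-site.xml is the usual HDP client-config location, but that is an assumption about the layout here.

# Sketch: compare non-HA vs. HA-related ResourceManager keys in the deployed
# yarn-site.xml (path is the usual HDP client-config location; adjust if needed).
import xml.etree.ElementTree as ET

YARN_SITE = "/etc/hadoop/conf/yarn-site.xml"  # assumed HDP client config path

props = {
    p.findtext("name", default=""): p.findtext("value", default="")
    for p in ET.parse(YARN_SITE).getroot().findall("property")
}

prefixes = (
    "yarn.resourcemanager.hostname",        # non-HA key plus .rm1/.rm2 variants
    "yarn.resourcemanager.webapp.address",  # addresses the AM proxy filter can use
    "yarn.resourcemanager.ha.",             # ha.enabled, ha.rm-ids, ...
)

for name in sorted(props):
    if name.startswith(prefixes):
        print(f"{name} = {props[name]}")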