
Multiple Spark History Servers - how to control the address in clients' spark-defaults.conf?

Explorer

Hi experts

The scenario is this:
Cluster with one Spark History Server on host Y.
Using curl I added another Spark History Server on host X.
I also created a separate config group for the new SHS, with its own spark.yarn.historyServer.address pointing to host X.
The goal is that all clients write their logs to the default SHS on host Y, and anyone who needs the new SHS has to set it manually in their Spark application arguments.
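For example, a per-application override can be passed directly to spark-submit (just a sketch; X:18081 stands for the new SHS host and port):

    spark-submit --conf spark.yarn.historyServer.address=X:18081 ... <application jar and arguments>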

The problem is that after adding the new SHS and performing "Refresh configs" for the Spark clients, all of them get the new SHS address in their /etc/spark2/conf/spark-defaults.conf (spark.yarn.historyServer.address = X:18081).

Where do I set which spark.yarn.historyServer.address the clients will use?

Thanks,

Adi

1 ACCEPTED SOLUTION

Explorer

Issue resolved.
In case anyone else needs to add another Spark History Server and wants to control which history server address the clients' config points to:
In Ambari >>> Spark >>> Configs >>> spark.yarn.historyServer.address
Replace the variables {{spark_history_server_host}}:{{spark_history_ui_port}} with a hardcoded server address and port for each config group.
Do NOT use the variables.
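As an illustration (using hosts Y and X and port 18081 from the question; substitute your own values), the default config group should then render the clients' /etc/spark2/conf/spark-defaults.conf with

    spark.yarn.historyServer.address = Y:18081

while hosts in the new config group get

    spark.yarn.historyServer.address = X:18081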
