Support Questions


Spark history server has an issue.


I can't see the Spark history server's user interface. The web UI only shows:

Event log directory: hdfs://server/user/spark/history

Recently, when I hit this issue, I managed to resolve it by increasing DAEMON_MEMORY_SIZE and HEAP_SIZE for the history server.
But that only helped for a couple of days, and now I have to restart it constantly.
Could someone help, please?


Expert Contributor

Hello @monorels ,


Thank you for posting the query.


Check the total size of the files in the event logging directory; if you have very large files, you may need to increase the Spark history server's heap memory. Also check the Spark history server logs to see whether any errors occur while it replays the event logs from the HDFS path. If no errors are observed, try enabling DEBUG-level logging. If there are still no errors logged, check the number of event logs against the allocated heap memory usage.
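As a starting point, you could check the size and count of the event logs with the HDFS shell, a minimal sketch using the event log path from the question above (adjust for your cluster):

```shell
# Total size of the event log directory (human-readable summary)
hdfs dfs -du -s -h hdfs://server/user/spark/history

# Number of files in the directory (dir count, file count, bytes, path)
hdfs dfs -count hdfs://server/user/spark/history

# List the largest individual event logs, which are the usual replay culprits
hdfs dfs -ls hdfs://server/user/spark/history | sort -k5 -n -r | head
```

If the totals are large relative to the history server's heap, that points to the memory pressure described above.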


New Contributor

I'm running with a Helm chart using the image apache/spark:3.5.1 and got the same issue. Could you give me advice on how to get through it?

Community Manager

@duyvo Welcome to the Cloudera Community!

As this is an older post, you would have a better chance of receiving a resolution by starting a new thread. This will also be an opportunity to provide details specific to your environment that could aid others in assisting you with a more accurate answer to your question. You can link this thread as a reference in your new post. Thanks.


Diana Torres,
Community Moderator


Master Collaborator

Based on the event log files, you need to adjust the Spark History Server settings. Please check whether SHS cleanup is enabled; when it is, Spark automatically cleans up old event log files. To load larger event log files, you need to increase the DAEMON_MEMORY_SIZE.
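A minimal sketch of the settings mentioned above; the retention periods and heap size are example values, not recommendations, so tune them to your workload:

```shell
# spark-defaults.conf — enable automatic cleanup of old event logs
# spark.history.fs.cleaner.enabled    true
# spark.history.fs.cleaner.interval   1d    # how often the cleaner runs
# spark.history.fs.cleaner.maxAge     7d    # delete event logs older than this

# spark-env.sh — raise the history server daemon heap (example value)
export SPARK_DAEMON_MEMORY=4g
```

With the cleaner enabled, the directory stops growing without bound, which addresses the "helps for a couple of days, then needs a restart" pattern from the original question.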

You can refer to the following article to adjust the SHS parameters: