Spark History UI crashing often?

New Contributor

We have found that the Spark History Server frequently needs to be restarted: the service on port 18080 goes down and the UI returns an HTTP 500 error. Any idea why this might be happening?

HDP 2.2.4.2, Spark 1.2

It was suggested that we increase the YARN heap to 8 GB, but that brought no improvement.

Thanks!

Joshua

1 ACCEPTED SOLUTION


@Joshua can you increase SPARK_DAEMON_MEMORY to 4g?

SPARK_DAEMON_MEMORY: Memory to allocate to the history server (default: 1g).

http://spark.apache.org/docs/latest/monitoring.html
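A minimal sketch of where that setting typically goes, assuming spark-env.sh is edited directly on the host running the History Server (on HDP the equivalent lines would usually go into the spark-env template in Ambari instead); SPARK_DAEMON_MEMORY and the start/stop scripts are standard Spark, but the SPARK_HOME path is an assumption about the local install:

# In $SPARK_HOME/conf/spark-env.sh (or the Ambari spark-env template)
export SPARK_DAEMON_MEMORY=4g   # raise the History Server daemon heap from the 1g default

# Restart the History Server so the new heap size takes effect
$SPARK_HOME/sbin/stop-history-server.sh
$SPARK_HOME/sbin/start-history-server.sh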


3 REPLIES

Master Mentor

@Joshua Lickteig are you experiencing this?


@Joshua can you increase SPARK_DAEMON_MEMORY to 4g?

SPARK_DAEMON_MEMORY: Memory to allocate to the history server (default: 1g).

http://spark.apache.org/docs/latest/monitoring.html

Cloudera Employee

Hi,


Can you share the error messages from the Spark History Server logs from when the Spark History UI crashes?


Thanks

AKR
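
For checking those logs, a minimal sketch of one way to pull the relevant errors, assuming the common HDP default log directory of /var/log/spark (the actual location is whatever SPARK_LOG_DIR points to, and the file name pattern is an assumption about a default daemon setup):

# Assumption: History Server logs live under /var/log/spark and contain "HistoryServer" in the file name
grep -iE 'error|exception|outofmemory' /var/log/spark/*HistoryServer*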