Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

Spark History UI crashing often?


We find that the Spark History UI often needs to be restarted: the service on port 18080 is down, or it returns an HTTP 500 error. Any reason this might be the case?

HDP 2.2.4.2, Spark 1.2
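For anyone reading the archive, a quick way to check the symptom described above (a sketch; "historyserver-host" is a placeholder for wherever the History Server runs, and 18080 assumes the default `spark.history.ui.port`):

```shell
# Probe the Spark History Server UI and print only the HTTP status code.
# "historyserver-host" is a placeholder; replace with the actual host.
curl -s -o /dev/null -w '%{http_code}\n' http://historyserver-host:18080/
# 200 means the UI is serving; 500 (or a refused connection) matches the
# symptom described above.
```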

It was suggested to increase the YARN heap to 8G, but that brought no improvement.

Thanks!

Joshua

1 ACCEPTED SOLUTION


@Joshua can you increase SPARK_DAEMON_MEMORY to 4g?

SPARK_DAEMON_MEMORY: Memory to allocate to the history server (default: 1g).

http://spark.apache.org/docs/latest/monitoring.html
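For reference, one way to apply this on a non-Ambari install (a sketch; the conf path and sbin location vary by distribution, and on an Ambari-managed HDP cluster the setting belongs in the Spark configs in Ambari instead):

```shell
# Raise the History Server daemon heap from the 1g default to 4g.
# /etc/spark/conf is an assumed location; adjust for your install.
echo 'export SPARK_DAEMON_MEMORY=4g' >> /etc/spark/conf/spark-env.sh

# Restart the History Server so the new heap size takes effect
# (paths relative to SPARK_HOME).
./sbin/stop-history-server.sh
./sbin/start-history-server.sh
```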


3 REPLIES

Master Mentor

@Joshua Lickteig are you experiencing this?


Cloudera Employee

Hi,


Can you share the error message from the Spark History Server logs from when the Spark History UI crashes?
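In case it helps locate them: the History Server writes its log under the Spark log directory (a sketch; /var/log/spark is a common location on HDP, while plain installs default to $SPARK_HOME/logs, and the exact file name pattern varies by install):

```shell
# Show the most recent lines of the History Server log.
# Directory and file name pattern are assumptions; adjust for your cluster.
tail -n 200 /var/log/spark/spark-*-org.apache.spark.deploy.history.HistoryServer-*.out
```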


Thanks

AKR