The Spark History Server replays application logs as soon as it finds event log files in the configured HDFS path (/user/spark/applicationHistory). The replay operation simply reads the event logs from that HDFS path and loads them into memory so they are available for rendering in the UI.
In your case, you have already confirmed that the file is present in the HDFS event log directory. As a next step, could you please review the Spark History Server logs and check whether the replay operation is happening?
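One quick way to check is to grep the History Server log for replay activity. The log path below is an assumption (it varies by distribution and by how the service was started), so adjust it for your cluster:

```shell
# Assumed log location; adjust for your distribution
# (e.g. /var/log/spark/ on CDH, $SPARK_HOME/logs/ for a manual start).
grep -i "replay" /var/log/spark/spark-history-server*.log | tail -n 20
```

If nothing shows up for your application, the file is likely not being picked up or the replay is failing before it is logged at INFO level.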
Also, if the file or directory permissions on the event logs are incorrect, the replay operation can fail silently. In such scenarios, you may need to enable DEBUG-level logging to see what is going wrong with the replay.
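To turn on DEBUG logging for just the History Server's provider classes (rather than all of Spark), you can add a logger entry to the History Server's log4j.properties and restart the service. The file path varies by distribution; this is a minimal sketch:

```properties
# In the History Server's log4j.properties (path varies by distribution):
# enable DEBUG output for the history/replay code paths only.
log4j.logger.org.apache.spark.deploy.history=DEBUG
```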
Hi @satz, I've checked the Spark History Server logs, and they show a read permission denied error for the user "spark". I recursively changed the permissions and ownership of /user/spark to spark, but each new event log file is created with its own permissions, so it again cannot be read by spark.
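This symptom usually comes from Spark writing each new event log file with restrictive permissions owned by the submitting user, so a one-time chmod/chown does not help future files. One common workaround (a sketch, assuming your HDFS has ACLs enabled via `dfs.namenode.acls.enabled=true`) is to set a *default* ACL on the event log directory; default ACLs are inherited by files created later, so new logs become readable by the "spark" user automatically:

```shell
# Run as the HDFS superuser. Path is the one from this thread.
# Grant the spark user access to existing files/directories:
hdfs dfs -setfacl -R -m user:spark:r-x /user/spark/applicationHistory
# Default ACL: inherited by all files created in the directory from now on,
# so newly written event logs are readable by "spark" without manual chmod.
hdfs dfs -setfacl -R -m default:user:spark:rwx /user/spark/applicationHistory
# Verify:
hdfs dfs -getfacl /user/spark/applicationHistory
```

Alternatively, running the History Server as a user in the same group that owns the event logs (or as an HDFS superuser) avoids the inheritance problem entirely.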