
Spark Driver and Executor logs in yarn cluster mode for CDP clusters

Expert Contributor

Where is the location for the Spark driver and executor logs when the Spark job is executed in YARN cluster mode?

Does the yarn logs -applicationId command capture the logs? I can't find log4j-active.log for the driver.

1 REPLY

Master Collaborator

Hi @zhuw.bigdata 

To locate the Spark driver and executor logs on the cluster nodes, follow these steps (command sketches follow the list):

  1. Access the Spark UI: Open the Spark UI in your web browser.
  2. Identify Nodes: Navigate to the Executors tab to view information about the driver and executor nodes involved in the Spark application.
  3. Determine Log Directory: Find the value of the yarn.nodemanager.log-dirs property in the YARN configuration (the Spark UI's Environment tab may also list it under the Hadoop properties). This is the base directory for container logs on each NodeManager host.
  4. Access Log Location: Using a terminal or SSH, log in to the relevant node (driver or executor) where the logs you need are located.
  5. Navigate to Application Log Directory: Within the yarn.nodemanager.log-dirs directory, access the subdirectory for the specific application using the pattern application_${appid}, where ${appid} is the unique application ID of the Spark job.
  6. Find Container Logs: Within the application directory, locate the individual container log directories named container_${contid}, where ${contid} is the container ID.
  7. Review Log Files: Each container directory contains the following log files generated by that container:
    • stdout: Standard output
    • stderr: Standard error output (Spark's log4j output goes to the console appender by default, so it usually lands here)
    • syslog: System-level logs
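
Putting steps 4 through 7 together, here is a minimal shell sketch. The application and container IDs below are hypothetical, and /yarn/container-logs is an assumed value for yarn.nodemanager.log-dirs (a common default on CDP hosts); substitute your own values:

```
# SSH to the host that ran the container of interest; in YARN cluster mode
# the driver runs inside the ApplicationMaster container (typically the
# container with suffix _000001).
# Layout: ${yarn.nodemanager.log-dirs}/application_${appid}/container_${contid}
cd /yarn/container-logs/application_1700000000000_0001

# One subdirectory per container that ran on this host
ls
# container_1700000000000_0001_01_000001   <- driver (ApplicationMaster)
# container_1700000000000_0001_01_000002   <- executor

# Driver output: log4j writes to the console appender by default, i.e. stderr
less container_1700000000000_0001_01_000001/stderr
```

Note that these local directories are typically removed once log aggregation collects the files after the application finishes, so for completed jobs use yarn logs instead.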
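As for your yarn logs question: yes, for a finished application with log aggregation enabled (yarn.log-aggregation-enable=true), yarn logs -applicationId retrieves the stdout/stderr/syslog of every container, driver included. As far as I can tell, Spark on YARN never writes a file named log4j-active.log (that name comes from a file-appender log4j configuration, not YARN cluster mode); the driver's log4j output is in the ApplicationMaster container's stderr. A sketch, again with hypothetical IDs:

```
# All containers of the application
yarn logs -applicationId application_1700000000000_0001

# Only the driver/ApplicationMaster container
yarn logs -applicationId application_1700000000000_0001 \
    -containerId container_1700000000000_0001_01_000001

# Only a specific log file across all containers (available on recent Hadoop releases)
yarn logs -applicationId application_1700000000000_0001 -log_files stderr
```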