Logs from spark executors
Labels: Apache Spark
Created on 04-17-2015 05:21 PM - edited 09-16-2022 02:26 AM
Where can I find the logs from Spark functions? I can find the logs of the Spark application itself, but when I need to debug a map function that I wrote, I cannot find those logs.
thanks!
Created 04-17-2015 10:05 PM
If you print or log to stdout, the output goes to the stdout of the executor process, wherever that happens to be running. In a YARN-based deployment, you can use "yarn logs ..." to retrieve the executor logs, I believe. Alternatively, start from the ResourceManager UI, navigate to the executor's container, and view its logs there.
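A sketch of what that looks like on the command line (this assumes YARN log aggregation is enabled via `yarn.log-aggregation-enable`; `<application_id>` is a placeholder for your job's ID):

```shell
# List finished applications to find the application ID of your Spark job
yarn application -list -appStates FINISHED

# Fetch the aggregated logs for that application; this includes the
# stdout/stderr of every container, i.e. the executors where your
# map functions actually ran
yarn logs -applicationId <application_id>
```

The aggregated output is per-container, so look for the containers other than the ApplicationMaster's to find your executors' stdout.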
Created 04-18-2015 05:19 PM
Thanks for your reply.
I found in the UI the path /var/log/spark/spark-worker. That is the Spark worker's own log. What I am actually looking for are the log messages I added inside the functions I wrote to map RDDs.
Please forgive me if I am using the wrong terminology here.
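To make the distinction concrete, here is a minimal sketch of logging from inside a map function. `parse_record` is a hypothetical example, not from this thread; in a real job you would pass it to `rdd.map(parse_record)`, and because the function body executes on the executors, its output lands in the executor logs, not the driver's. Plain `map()` is used below so the sketch runs without a cluster:

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("my_map_fn")

def parse_record(line):
    # On a cluster, this call runs inside an executor process, so the
    # message is written to that executor's log, not the driver's.
    log.info("parsing record: %r", line)
    return line.split(",")

# In local mode (master=local[*]) driver and executors share one process,
# so the output is visible directly; plain map() simulates that here:
records = list(map(parse_record, ["a,1", "b,2"]))
print(records)  # [['a', '1'], ['b', '2']]
```

This is why the messages never show up in /var/log/spark/spark-worker: that file is the worker daemon's own log, while closure output goes to the per-application executor stdout/stderr that `yarn logs` (or the executor links in the web UI) expose.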
