If you want to follow the memory usage of individual Spark executors, one way to do it is via the Spark metrics properties. I've previously posted the following guide, which may help you set this up if it fits your use case;
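As a minimal sketch, a `metrics.properties` that ships metrics to a Graphite sink might look like the following (the host is a placeholder; adjust port, period, and paths to your environment):

```properties
# Send metrics from all instances (master, worker, driver, executor) to Graphite
*.sink.graphite.class=org.apache.spark.metrics.sink.GraphiteSink
*.sink.graphite.host=graphite.example.com
*.sink.graphite.port=2003
*.sink.graphite.period=10
*.sink.graphite.unit=seconds

# Enable JVM metrics (heap usage etc.) on the driver and executors
driver.source.jvm.class=org.apache.spark.metrics.source.JvmSource
executor.source.jvm.class=org.apache.spark.metrics.source.JvmSource
```

You can point Spark at this file with `--conf spark.metrics.conf=/path/to/metrics.properties`, or place it in `$SPARK_HOME/conf/metrics.properties`.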
I'm not able to reply to your comment directly; the Reply option doesn't seem to be available, hence replying here.
HDFS write bytes by executor should look something like this (be sure to set the left Y-axis unit type to bytes);
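As a sketch, a Graphite query for this panel might look like the line below. The metric path assumes the default namespace of application ID followed by executor ID; adjust the leading nodes if you've set `spark.metrics.namespace`:

```
aliasByNode(nonNegativeDerivative(application_*.*.executor.filesystem.hdfs.write_bytes), 1)
```

`nonNegativeDerivative` turns the ever-growing byte counter into a per-interval rate, and `aliasByNode(..., 1)` labels each series with its executor ID.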
Executor and driver memory usage example (as above, set the left Y-axis unit to bytes);
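With the JVM source enabled in `metrics.properties`, heap gauges are exposed per instance; a sketch of the Graphite query, again assuming the default `applicationId.executorId` metric prefix:

```
aliasByNode(application_*.*.jvm.heap.used, 1)
```

The driver reports under the instance ID `driver`, so the same wildcard pattern picks it up alongside the numbered executors.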
I'll try to find time later to give you some more examples, but they are mostly slight variations on the examples above :-)
Thanks for the comment. I'll try the metrics queries and let you know. Also looking forward to your additional examples as well.