Created 10-26-2018 12:28 PM
Is there a proper way to monitor the memory usage of a Spark application?
By memory usage, I don't mean the executor memory (which can be set), but the actual memory usage of the application.
Note: We are running Spark on YARN.
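For reference: the queries suggested further down in this thread assume that Spark's metrics are being shipped to Graphite. A minimal metrics.properties sketch that enables the GraphiteSink could look like the following (the host, port and prefix values are placeholders, adjust them to your environment):

# Send all Spark metrics to a Graphite/Carbon endpoint
*.sink.graphite.class=org.apache.spark.metrics.sink.GraphiteSink
*.sink.graphite.host=graphite.example.com
*.sink.graphite.port=2003
*.sink.graphite.period=10
*.sink.graphite.unit=seconds
*.sink.graphite.prefix=spark
# Expose JVM (heap / non-heap) metrics from the driver and executors
driver.source.jvm.class=org.apache.spark.metrics.source.JvmSource
executor.source.jvm.class=org.apache.spark.metrics.source.JvmSource

The file can be shipped with --files metrics.properties together with spark.metrics.conf=metrics.properties, or placed in $SPARK_HOME/conf.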
Created 11-03-2018 07:49 PM
I am not able to reply to your comment; the Reply option does not seem to be available, so I am replying here.
Thanks for the comment. I will try the metrics queries and let you know.
Also looking forward to your updates on the items below.
Created 11-05-2018 07:38 PM
@Jonathan Sneep
Could you please check whether the metrics queries below are correct?
Also, please let me know the queries for the following:
Looking forward to your update on this.
Created 11-07-2018 09:24 AM
Did you have a chance to look into this?
Created 11-02-2018 12:48 PM
Nice work. HDFS write bytes by executor should look something like this (be sure to set the left Y-axis unit type to bytes):
aliasByNode($application.*.executor.filesystem.*.write_bytes, 1)
Executor and driver memory usage example (as above, set the left Y-axis unit to bytes):
aliasByNode($application.*.jvm.heap.used, 1)
I'll try to find time later to give you some more examples, but they are mostly slight variations on the examples above :-)
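For example, a couple of likely variations (assuming the standard metric names emitted by Spark's executor filesystem and JVM sources; check what actually appears under your application prefix in Graphite):

HDFS read bytes by executor:
aliasByNode($application.*.executor.filesystem.*.read_bytes, 1)

Non-heap usage per driver/executor JVM:
aliasByNode($application.*.jvm.non-heap.used, 1)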