How can I find out, while a Spark application is running, how much temporary data the selected executors have created and the size of all variables at a particular point in the program?
For example: I read an object from S3 and load it into a Python DataFrame. While processing that DataFrame, I would like to know how much temporary data has been created and what the size of all existing variables is at a particular point in the program.
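On the driver side, the closest I have found is a stdlib-only sketch like the one below (`sizes_of` is a hypothetical helper I wrote; note that `sys.getsizeof` only reports the shallow size of a container, not the memory held by nested objects, and it says nothing about executor-side memory):

```python
import sys

def sizes_of(namespace):
    """Return the shallow size in bytes of each variable in a namespace dict."""
    return {
        name: sys.getsizeof(value)
        for name, value in namespace.items()
        if not name.startswith("__")
    }

# Hypothetical stand-in for data that would be read from S3.
records = [{"id": i, "payload": "x" * 100} for i in range(1000)]

# Snapshot the sizes of variables at this point in the program.
snapshot = sizes_of({"records": records})
print(snapshot)  # shallow size of the list object only, not its elements
```

Is there a way to get the equivalent information for the executors' temporary data while the job is running, rather than only for driver-local variables like this?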