Hello,
I have a Spark YARN client submitting jobs, and each application I submit leaves files like these under my "HadoopTmp" directory:
__spark_conf__8681611713144350374.zip
__spark_libs__4985837356751625488.zip
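For context, I submit the jobs roughly like this (a minimal sketch; the class and jar names are placeholders, not my actual job):

    spark-submit \
      --master yarn \
      --deploy-mode client \
      --class com.example.MyApp \
      myapp.jar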
Is there a way these can be cleaned up automatically? Every time I submit a Spark job, new entries appear in the same folder, and this is flooding my directory. What should I set to make them clear automatically, or should I manually delete them from HDFS?
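Right now the only workaround I know is deleting them by hand, something like this (a rough sketch; "/HadoopTmp" stands in for my actual directory path):

    # list the leftover per-application archives
    hdfs dfs -ls /HadoopTmp

    # remove them manually (what I do today)
    hdfs dfs -rm -skipTrash "/HadoopTmp/__spark_conf__*.zip" "/HadoopTmp/__spark_libs__*.zip"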