I am facing a situation where I get different explain plans for the exact same operation on the exact same DataFrame in my Spark code.
I understand this is because Spark's optimizer makes multiple passes over the plan, trying to optimize it in different ways, and the maximum number of passes is controlled by the parameter spark.sql.optimizer.maxIterations.
My problem is that there is a very heavy operation I would like to perform, and the optimizer sometimes does a very good job and sometimes a rather poor one. This results in an almost 30% difference in performance for the same query on the same dataset.
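To confirm that the plans really differ between runs, I fingerprint the captured plan text. This is just a helper sketch of my own: the regex normalization is an assumption on my part, because expression IDs like a#12 change between sessions even when the plan shape is identical, so they have to be stripped before comparing.

```python
import hashlib
import re

def plan_fingerprint(plan_text: str) -> str:
    """Reduce an explain-plan string to a short hash so plans from two
    runs can be compared cheaply. Expression IDs (e.g. a#12) differ
    between sessions even for identical plan shapes, so strip the
    numeric part first."""
    normalized = re.sub(r"#\d+", "#", plan_text)
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()[:12]
```

I log the fingerprint of the captured explain output before each run; when the fingerprint changes, I know the optimizer chose a different plan for the same query.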
Given this, my questions are:
Is it possible to freeze the explain plan for an operation somehow in Spark?
Bumping up spark.sql.optimizer.maxIterations to 500 doesn't seem to fix it either. Is this parameter even used?
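For reference, this is how I am setting the parameter (the job script name is just a placeholder):

```shell
# Raise the optimizer's fixed-point iteration cap for this job.
# my_job.py is a placeholder for the actual application.
spark-submit \
  --conf spark.sql.optimizer.maxIterations=500 \
  my_job.py
```

I have also tried setting it at runtime via spark.conf.set, with the same result.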