I'm writing some PySpark code where I have a DataFrame that I want to write to a Hive table, using a command like this:

dataframe.write.mode("overwrite").saveAsTable("bh_test")

Everything I've read online indicates that this should create a managed table by default. However, no matter what I try, it always creates an external table. Is there a configuration setting somewhere that overrides the default behavior?