I used .saveAsTable on my DataFrame, so it is now stored in the Hive warehouse directory on HDFS and registered in the metastore. How can I load this back into Spark SQL? I have deleted my cluster (Azure HDInsight) and created a new one, and I have confirmed that the Hive metastore location is the same and that the warehouse directory is still there.
I need to load it again as a persistent table, not as a temp table, because I am using the PowerBI/Spark connector. The only way I have found so far is to load the directory back into a DataFrame and run .saveAsTable again, which rewrites all the data and takes a long time to process. I'm hopeful there is a better way!
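For reference, my current workaround looks roughly like this (a minimal sketch in PySpark; the table name and warehouse path are placeholders, and I'm assuming the data was originally written in Spark's default Parquet format):

```python
from pyspark.sql import SparkSession

# Hive support is needed so saveAsTable registers the table in the metastore.
spark = SparkSession.builder.enableHiveSupport().getOrCreate()

# Read the files left behind in the old warehouse directory back into a DataFrame.
# "/hive/warehouse/my_table" is a placeholder for the actual HDFS path.
df = spark.read.parquet("/hive/warehouse/my_table")

# Persist it as a metastore table again — this rewrites every file a second
# time, which is the slow step I would like to avoid.
df.write.saveAsTable("my_table")
```

The data is already sitting in the warehouse directory, so rewriting it just to recreate the metastore entry feels redundant.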