03-06-2018 09:26 AM
This workaround has a severe problem:

val options = Map("path" -> "this is the path to your warehouse")

Do NOT do this. When you set "path" to just the warehouse location, Spark treats that directory as the table location to be purged during an overwrite, which can wipe out everything in your warehouse. So if you put "user/hive/warehouse", it will delete everything under "user/hive/warehouse". This is dangerous and should not have been marked as the accepted answer. I suspect the only reason maziyar didn't have everything wiped is that he is using a separate warehouse for each database, or he is actually specifying the full path to each table.
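To make the hazard concrete, here is a minimal sketch of the dangerous pattern versus a safer one. This assumes a running SparkSession named spark and a DataFrame named df; the paths and table names (mydb, mytable) are hypothetical placeholders, not values from the original thread. Illustrative only, do not run it against a real warehouse.

```scala
// Assumed context: `spark` is an active SparkSession, `df` is a DataFrame.

// DANGEROUS: "path" points at the warehouse ROOT. Under SaveMode.Overwrite,
// Spark treats that whole directory as the table location and purges it,
// taking every other table in the warehouse with it.
df.write
  .mode("overwrite")
  .option("path", "/user/hive/warehouse")            // entire warehouse at risk
  .saveAsTable("mydb.mytable")

// SAFER: point "path" at a table-specific directory, so an overwrite can
// only ever delete that one table's data...
df.write
  .mode("overwrite")
  .option("path", "/user/hive/warehouse/mydb.db/mytable")  // table-scoped path
  .saveAsTable("mydb.mytable")

// ...or omit "path" entirely and let the metastore place the managed table
// under its database's default location.
df.write
  .mode("overwrite")
  .saveAsTable("mydb.mytable")
```

The key design point: "path" on saveAsTable defines the table's storage location, and overwrite semantics apply to that location, so it must never be broader than a single table's directory.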