Spark SQL 2.3, CDH 5.16.2
I am trying to save a DataFrame to a table in Spark. The table already exists in Hive.
But while the Spark job is running, if I run `show create table t_table` in Hive,
it gives me a "table not found" error. After the Spark job finishes, I can find the table again.
My question is:
Is there any way to avoid this issue? It will cause problems if other programs refer to the same table in Hive while the job is running.
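For context, this behavior matches what `saveAsTable` in overwrite mode does: Spark drops and recreates the table, so there is a window during the job where Hive cannot see it. The write call isn't shown in the question, so this is an assumption; the database name `default` and the source table `t_source` below are also hypothetical. A minimal sketch of the likely call and an alternative (`insertInto`) that overwrites the data while keeping the existing Hive table definition visible:

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

object HiveWriteSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("hive-write-sketch")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical source DataFrame; its schema must match t_table.
    val df = spark.table("default.t_source")

    // Likely current approach: overwrite mode with saveAsTable
    // DROPs and recreates t_table, so Hive briefly reports
    // "table not found" while the job is running.
    df.write.mode(SaveMode.Overwrite).saveAsTable("default.t_table")

    // Alternative: insertInto overwrites the table's data but
    // reuses the existing Hive table definition, so the table
    // stays visible to other Hive clients throughout the job.
    df.write.mode(SaveMode.Overwrite).insertInto("default.t_table")

    spark.stop()
  }
}
```

Note that `insertInto` resolves columns by position, not by name, so the DataFrame's column order must match the target table's.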