Posted 02-02-2022 08:15 AM
In HDP 3.0 and later, Spark and Hive use separate catalogs to access SparkSQL and Hive tables. The Spark catalog contains tables created by Spark; the Hive catalog contains tables created by Hive. By default, the standard Spark APIs access tables in the Spark catalog. To access tables in the Hive catalog instead, edit the metastore.catalog.default property in hive-site.xml, changing its value from 'spark' to 'hive'.

Config file path: $SPARK_HOME/conf/hive-site.xml

Before the change:

<property>
  <name>metastore.catalog.default</name>
  <value>spark</value>
</property>

After the change:

<property>
  <name>metastore.catalog.default</name>
  <value>hive</value>
</property>
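If you prefer to script the edit rather than change the file by hand, a minimal sketch using only the Python standard library could look like the following. The helper name set_catalog_default is hypothetical, and it assumes the standard Hadoop <configuration>/<property> layout of hive-site.xml:

```python
import xml.etree.ElementTree as ET

def set_catalog_default(hive_site_path, catalog="hive"):
    """Set metastore.catalog.default in a hive-site.xml file.

    Assumes the usual Hadoop <configuration> root containing
    <property><name>...</name><value>...</value></property> entries.
    Appends the property if it is not present yet.
    """
    tree = ET.parse(hive_site_path)
    root = tree.getroot()
    for prop in root.findall("property"):
        if prop.findtext("name") == "metastore.catalog.default":
            prop.find("value").text = catalog
            break
    else:
        # Property not found: create it under <configuration>.
        prop = ET.SubElement(root, "property")
        ET.SubElement(prop, "name").text = "metastore.catalog.default"
        ET.SubElement(prop, "value").text = catalog
    tree.write(hive_site_path)
```

Run it against your own copy of the config, e.g. set_catalog_default("/usr/hdp/current/spark2-client/conf/hive-site.xml") (path is illustrative), then restart the Spark service so the change takes effect.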