A simple Spark 1.x Java application that lists the tables in the Hive metastore is as follows:
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.hive.HiveContext;

public class SparkHiveExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("SparkHive Example");
        SparkContext sc = new SparkContext(conf);

        // HiveContext reads table metadata from the Hive metastore
        // configured via hive-site.xml on the classpath.
        HiveContext hiveContext = new HiveContext(sc);

        // "show tables" is parsed as HiveQL and executed by Spark.
        DataFrame df = hiveContext.sql("show tables");
        df.show();
    }
}
Note that Spark pulls table metadata from the Hive metastore and uses HiveQL to parse the queries, but the queries themselves are executed by Spark's own execution engine, not by Hive.
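For comparison, in Spark 2.x and later the HiveContext class is deprecated in favor of SparkSession. The same table listing can be sketched as follows, assuming a Spark 2.x (or newer) installation with Hive support and a hive-site.xml on the classpath:

```java
import org.apache.spark.sql.SparkSession;

public class SparkHiveExampleV2 {
    public static void main(String[] args) {
        // SparkSession unifies SparkContext, SQLContext, and HiveContext;
        // enableHiveSupport() wires it up to the Hive metastore.
        SparkSession spark = SparkSession.builder()
                .appName("SparkHive Example")
                .enableHiveSupport()
                .getOrCreate();

        // In the Java API, sql() now returns a Dataset<Row>
        // rather than the Spark 1.x DataFrame class.
        spark.sql("show tables").show();

        spark.stop();
    }
}
```

The division of labor is unchanged: metadata comes from the Hive metastore, parsing follows HiveQL, and execution happens in Spark.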