While trying to run Hive queries using HiveContext in spark-shell, the exception below is raised.
scala> import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.hive.HiveContext
scala> val hc = new org.apache.spark.sql.hive.HiveContext(sc)
hc: org.apache.spark.sql.hive.HiveContext = org.apache.spark.sql.hive.HiveContext@44e07443
scala> val hivequery = hc.sql("show databases")
17/12/31 10:58:44 ERROR Datastore.Schema: Failed initialising database.
Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set
I am not able to execute any Hive query through spark-shell and cannot use the metastore (saved Hive tables) in Spark.
I am using the Cloudera QuickStart VM 5.12.
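For reference, this is the kind of metastore access I expect to work once the connection issue is resolved (a minimal sketch for the spark-shell on this VM; the database/table name my_db.my_table is just a placeholder):

import org.apache.spark.sql.hive.HiveContext

// spark-shell already provides a SparkContext as `sc`.
val hc = new HiveContext(sc)

// List the databases registered in the Hive metastore.
hc.sql("show databases").show()

// Query an existing Hive table (my_db.my_table is a placeholder name).
hc.sql("select * from my_db.my_table limit 10").show()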