Although this thread is quite old, I'm posting my solution, as it may still help others.
I encountered this problem in spark2-shell because hive.metastore.uris was not set in client mode. I added the following to spark-defaults.conf (via Cloudera Manager's
"Spark 2 Client Advanced Configuration Snippet (Safety Valve) for spark2-conf/spark-defaults.conf"):
spark.sql.warehouse.dir=hdfs://nameservice1/user/hive/warehouse
spark.executor.extraJavaOptions=-Dhive.metastore.uris=thrift://hive.metastore.net:9083
spark.driver.extraJavaOptions=-Dhive.metastore.uris=thrift://hive.metastore.net:9083
After that, I could list and access all tables from the Hive metastore from within spark2-shell.
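To confirm the metastore is actually reachable, you can run a couple of quick queries inside spark2-shell. This is just a sketch; `mydb.mytable` is a hypothetical database/table name, replace it with one that exists in your metastore:

```scala
// In spark2-shell (SparkSession is pre-bound as `spark`):
// list databases known to the Hive metastore
spark.sql("SHOW DATABASES").show()

// list tables in a database and read from one
// (mydb/mytable are placeholders for your own objects)
spark.sql("SHOW TABLES IN mydb").show()
spark.sql("SELECT * FROM mydb.mytable LIMIT 5").show()
```

If the metastore URI is still not picked up, `SHOW DATABASES` will typically return only `default` (backed by an empty local/embedded metastore) instead of your real databases, which is a quick way to tell whether the setting took effect.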