
From Zeppelin Spark2 interpreter, I can show Hive tables, but cannot query them. See attached screen capture below.

New Contributor

When querying a Hive table from a Zeppelin notebook using the spark2 interpreter, I can show the tables in Hive without error using the following commands:


val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

val tables = sqlContext.sql("show tables")

But I cannot query the Hive table. When I tried to run the following statement, I got an error:

sqlContext.sql("select * from t1 LIMIT 10").collect().foreach(println)

The error message refers to "dev" (see attached image for more info).
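As a side note, in Spark 2.x the `HiveContext` used above is deprecated in favour of a `SparkSession` with Hive support. A minimal sketch of the same two queries (in Zeppelin's spark2 interpreter a session is usually pre-created as `spark`, so the builder step is shown only for completeness; `t1` is the table from the question):

```scala
import org.apache.spark.sql.SparkSession

// SparkSession with Hive support replaces the Spark 1.x HiveContext.
val spark = SparkSession.builder()
  .appName("HiveQueryTest")
  .enableHiveSupport()
  .getOrCreate()

// Same queries as above, via the SparkSession API.
spark.sql("show tables").show()
spark.sql("select * from t1 LIMIT 10").collect().foreach(println)
```

This requires a working Hive metastore connection, so it reproduces the same underlying error if the cluster configuration is wrong.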

My cluster name is Dev. Could this be due to case sensitivity?



Is 'dev' the nameservice name of your HDFS? If yes, please make sure HADOOP_CONF_DIR is set in zeppelin-env, or just copy core-site.xml and hdfs-site.xml to the Zeppelin conf directory.
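For illustration, setting HADOOP_CONF_DIR in zeppelin-env.sh would look like this (the path below is the typical HDP client-config location and is only an example; use wherever your cluster's core-site.xml and hdfs-site.xml actually live):

```shell
# zeppelin-env.sh — point Zeppelin at the Hadoop client configs so the
# HDFS nameservice (e.g. "dev") can be resolved. Path is illustrative.
export HADOOP_CONF_DIR=/etc/hadoop/conf
```

Restart the Zeppelin interpreter after changing this so the new environment is picked up.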

Hope this helps.

Super Mentor

@Mike Li

In addition to Sandeep's comment,

Can you also check the following:

Possible Cause:

1. Check whether your "/etc/hosts" file contains the correct FQDN (fully qualified domain name) and IP address information.
2. It can also happen due to DNS setup issues, so please check that the NameNode host resolves to the correct domain.
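As a concrete illustration of point 1, an /etc/hosts entry should list the IP address, then the FQDN, then the short hostname (the address and names below are made up):

```
# /etc/hosts — IP address, FQDN, then short alias (example values only)
10.0.0.11   namenode1.example.com   namenode1
```

You can then verify resolution from the Zeppelin host with `getent hosts namenode1.example.com` and confirm it returns the expected address.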

In some cases it can happen when the "" property is set to "false".
For testing, can you try setting it to "true" (the default) once, to see if it helps? Setting it to "false" is needed only if your network is set up such that you need to use hostnames in delegation tokens instead of IP addresses.

When the property is set to "true", a lookup is done on the IP address of the sender, which is compared to the SPN in the token.
You can add this property inside your "core-site.xml".
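For illustration only — the property name is left blank in the post above, but the described behaviour (hostnames vs. IP addresses in delegation tokens, default "true") matches Hadoop's `hadoop.security.token.service.use_ip`, so assuming that is the property meant, the core-site.xml entry would look like:

```xml
<!-- core-site.xml: assumption — the post leaves the property name blank;
     hadoop.security.token.service.use_ip matches the described behaviour -->
<property>
  <name>hadoop.security.token.service.use_ip</name>
  <value>true</value>
</property>
```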


@Mike Li

Are you able to resolve this issue?