
how to access the hive tables from spark-shell

Re: how to access the hive tables from spark-shell

Explorer
Yes, I have a Spark gateway on the host, and I copied hive-site.xml into /etc/spark/conf.
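
If the config is in place, a quick sanity check from spark-shell is to build a HiveContext and list what the metastore reports. A minimal sketch for Spark 1.x (sc is provided by the shell; the output depends on what your metastore actually contains):

// spark-shell provides sc; HiveContext picks up hive-site.xml
// from the Spark conf directory.
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)
hiveContext.sql("show databases").show()   // lists the Hive databases
hiveContext.tableNames().foreach(println)  // tables in the current database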



Re: how to access the hive tables from spark-shell

New Contributor

On the Spark configuration page, I don't have the Hive checkbox either.

Try installing another version of Spark.

Re: how to access the hive tables from spark-shell

New Contributor
I tried this, but I get a permission denied error. Can you please help?

Re: how to access the hive tables from spark-shell

New Contributor

Hi,

Did you fix this issue?

Re: how to access the hive tables from spark-shell

New Contributor

Try "select * from db.table" in line 3

Re: how to access the hive tables from spark-shell

New Contributor

Hi,

I am trying to access an existing Hive table from PySpark. For example, a table named "department" already exists in Hive's default database.

Error message:

18/10/15 22:01:23 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
18/10/15 22:02:35 WARN metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.1.0-cdh5.13.0
18/10/15 22:02:38 WARN metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException

I checked the files below, and they are identical:

/usr/lib/hive/conf/hive-site.xml
/usr/lib/spark/conf/hive-site.xml

Any help on how to set up the HiveContext from PySpark is highly appreciated.
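
For reference, here is how the explicit HiveContext setup looks in spark-shell (Scala) on Spark 1.x; PySpark mirrors it with pyspark.sql.HiveContext(sc). This is a sketch, assuming the department table from the post above:

// sc already exists in spark-shell; HiveContext reads hive-site.xml
// from Spark's conf directory (e.g. /usr/lib/spark/conf).
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)
val df = hiveContext.sql("select * from default.department")
df.show()

If the "Failed to get database default" warning persists, the shell is most likely talking to a local, empty metastore rather than the one Hive uses; the metastore URI fix in the reply below addresses exactly that.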

Re: how to access the hive tables from spark-shell

New Contributor

Hi there,

Just in case someone still needs the solution, here is what I tried, and it works:

spark-shell --driver-java-options "-Dhive.metastore.uris=thrift://quickstart:9083"

I am using Spark 1.6 with the Cloudera QuickStart VM.

val df = sqlContext.sql("show databases")
df.show()

You should be able to see all the databases in Hive. I hope it helps.
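
If passing driver JVM options is inconvenient, the same property can be set on the context itself. This is a sketch under the assumption that the metastore client is created lazily, so the property must be set before the first Hive query of the session:

// Alternative to the -D flag (assumption: nothing has touched the
// metastore yet in this session).
sqlContext.setConf("hive.metastore.uris", "thrift://quickstart:9083")
sqlContext.sql("show databases").show()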
