New Contributor
Posts: 4
Registered: ‎10-23-2017

Re: how to access the hive tables from spark-shell

Hi!

 

Have you installed the appropriate Gateways on the server where these configuration settings are required?

Explorer
Posts: 6
Registered: ‎09-08-2017

Re: how to access the hive tables from spark-shell

Yes, I have a spark gateway on the host and I copied hive-site.xml into /etc/spark/conf.


New Contributor
Posts: 4
Registered: ‎10-23-2017

Re: how to access the hive tables from spark-shell

I don't see the Hive checkbox on the Spark configuration page either.

You could try installing a different version of Spark.

New Contributor
Posts: 1
Registered: ‎05-20-2018

Re: how to access the hive tables from spark-shell

Try "select * from db.table" on line 3.

New Contributor
Posts: 1
Registered: ‎10-15-2018

Re: how to access the hive tables from spark-shell

Hi,


I am trying to access an existing Hive table using PySpark, e.g. a table named "department" that exists in the default database.

Error message:

 

18/10/15 22:01:23 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
18/10/15 22:02:35 WARN metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.1.0-cdh5.13.0
18/10/15 22:02:38 WARN metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException

 

I checked the files below; they are identical.

 

/usr/lib/hive/conf/hive-site.xml

 

/usr/lib/spark/conf/hive-site.xml
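The "Version information not found in metastore" and "Failed to get database default" warnings usually indicate that Spark fell back to an embedded Derby metastore instead of the configured one. One way to double-check which metastore a given hive-site.xml points at is to read hive.metastore.uris out of it with the standard library (a sketch; the sample XML and the thrift URI below are illustrative, point the parser at your real file):

```python
# Sketch: extract hive.metastore.uris from hive-site.xml text using only
# the standard library, to confirm which metastore Spark would talk to.
import xml.etree.ElementTree as ET

def metastore_uri(hive_site_xml):
    """Return the value of hive.metastore.uris, or None if unset."""
    root = ET.fromstring(hive_site_xml)
    for prop in root.iter("property"):
        if prop.findtext("name") == "hive.metastore.uris":
            return prop.findtext("value")
    return None  # unset: Hive/Spark falls back to an embedded metastore

# Illustrative sample; in practice read the real file's contents.
sample = """<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host:9083</value>
  </property>
</configuration>"""

print(metastore_uri(sample))  # thrift://metastore-host:9083
```

If the property is missing or the two copies disagree, Spark may be silently using a local metastore, which would explain the warnings above.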

 

Any help on how to set up the HiveContext from pyspark is highly appreciated.
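For later readers, a minimal sketch of the Spark 1.x / CDH 5 setup being asked about (the table name "department" is taken from the post above; this assumes pyspark is launched with the cluster's Spark so that hive-site.xml on the conf path is picked up):

```python
# Sketch for Spark 1.x (CDH 5): HiveContext reads hive-site.xml from
# Spark's conf directory; if that file is not found, Spark silently
# creates a local Derby metastore, which matches the warnings above.
from pyspark import SparkContext
from pyspark.sql import HiveContext

sc = SparkContext(appName="hive-table-check")
sqlContext = HiveContext(sc)

# List what the metastore actually exposes before querying;
# an empty list here points to a metastore configuration problem.
print(sqlContext.tableNames("default"))

df = sqlContext.sql("SELECT * FROM default.department")
df.show()
```

This requires a running cluster and metastore, so it is not runnable standalone; on Spark 2.x the equivalent is SparkSession.builder.enableHiveSupport().getOrCreate().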
