Expert Contributor
Posts: 99
Registered: 07-01-2015
Accepted Solution

Spark failed to get database default

Hi, 

I just did a fresh clean install of CDH 5.11 with Hive and Spark, everything in dev mode - so embedded database, no HA, simple setup.

 

When I try to run spark-shell I get these warnings:

 

Spark context available as sc (master = yarn-client, app id = application_1498152485620_0001).
17/06/22 19:30:59 WARN metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.1.0
17/06/22 19:30:59 WARN metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException
SQL context available as sqlContext.

 

 

Is this some bug related to Spark 1.6?

 

Thanks

Tomas

 

Expert Contributor
Posts: 99
Registered: 07-01-2015

Re: Spark failed to get database default

The interesting thing is that if I download Spark 2.1 and configure it (point it to the Hadoop conf dir under /etc), it works fine on YARN and has no problem showing databases or tables.
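
Roughly what I did to wire it up - just a sketch from my setup, the Spark download path is only an example and the conf dirs are the CDH defaults:

export SPARK_HOME=/opt/spark-2.1.0-bin-hadoop2.6   # example path to the downloaded Spark 2.1
export HADOOP_CONF_DIR=/etc/hadoop/conf            # default CDH client config dir
export YARN_CONF_DIR=/etc/hadoop/conf
cp /etc/hive/conf/hive-site.xml $SPARK_HOME/conf/  # so Spark sees the real metastore config
$SPARK_HOME/bin/spark-shell --master yarn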
Posts: 566
Topics: 3
Kudos: 79
Solutions: 52
Registered: 08-16-2016

Re: Spark failed to get database default

The database it is trying to access is the backend database for the Hive Metastore. Are you able to access and view databases and tables in Hive?
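
For example, something like this from a gateway node should list the same databases and tables (the hostname and user are just placeholders for your environment):

beeline -u jdbc:hive2://<hs2-host>:10000 -n <user>
show databases;
show tables;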

Champion
Posts: 424
Registered: 05-16-2016

Re: Spark failed to get database default

@mbigelow I agree with you on this.

 

Could you check your Hive Metastore / HS2 status?

 

sudo service hive-metastore status
sudo service hive-server2 status

 

Expert Contributor
Posts: 99
Registered: 07-01-2015

Re: Spark failed to get database default

I am sorry, this was probably my mistake. I don't know why, but now sqlContext.sql("show tables").collect returns the tables, so I am able to access the metastore. The warning message is still there during spark-shell startup, but it works.
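
For anyone hitting the same warning: a quick way to repeat the check from the command line is to pipe it into spark-shell (this is essentially what I ran interactively):

echo 'sqlContext.sql("show databases").collect().foreach(println)' | spark-shell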

 

 
