<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Spark 2.3 : pyspark.sql.utils.AnalysisException: u"Database 'test' not found;" -  Only default hive database is visible in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-2-3-pyspark-sql-utils-AnalysisException-u-quot/m-p/190264#M83547</link>
    <description>&lt;P&gt;The default database it was showing was Spark's default database, whose location is '/apps/spark/warehouse', not Hive's default database.&lt;/P&gt;&lt;P&gt;I was able to resolve this by copying hive-site.xml from the hive conf dir to the spark conf dir.&lt;/P&gt;&lt;PRE&gt;cp /etc/hive/conf/hive-site.xml /etc/spark2/conf&lt;/PRE&gt;&lt;P&gt;Try running this query in your metastore database; in my case it is MySQL.&lt;/P&gt;&lt;PRE&gt;mysql&amp;gt; SELECT NAME, DB_LOCATION_URI FROM hive.DBS;&lt;/PRE&gt;&lt;P&gt;You will see two default databases there, one pointing to 'spark.sql.warehouse.dir' and the other to 'hive.metastore.warehouse.dir'. The locations will depend on the values you have set for these configuration properties.&lt;/P&gt;</description>
    <pubDate>Sun, 16 Sep 2018 03:53:49 GMT</pubDate>
    <dc:creator>er_sharma_shant</dc:creator>
    <dc:date>2018-09-16T03:53:49Z</dc:date>
    <item>
      <title>Spark 2.3 : pyspark.sql.utils.AnalysisException: u"Database 'test' not found;" -  Only default hive database is visible</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-2-3-pyspark-sql-utils-AnalysisException-u-quot/m-p/190263#M83546</link>
      <description>&lt;P&gt;I have installed HDP 3.0 and am using Spark 2.3 and Hive 3.1.&lt;/P&gt;&lt;P&gt;When I try to access Hive tables through Spark (pyspark/spark-shell), I get the error below.&lt;/P&gt;&lt;PRE&gt;Traceback (most recent call last):
  File "&amp;lt;stdin&amp;gt;", line 1, in &amp;lt;module&amp;gt;
  File "/usr/hdp/current/spark2-client/python/pyspark/sql/session.py", line 716, in sql
    return DataFrame(self._jsparkSession.sql(sqlQuery), self._wrapped)
  File "/usr/hdp/current/spark2-client/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
  File "/usr/hdp/current/spark2-client/python/pyspark/sql/utils.py", line 71, in deco
    raise AnalysisException(s.split(': ', 1)[1], stackTrace)
pyspark.sql.utils.AnalysisException: u"Database 'test' not found;"&lt;/PRE&gt;&lt;P&gt;Only default hive database is visible in Spark.&lt;/P&gt;&lt;PRE&gt;&amp;gt;&amp;gt;&amp;gt; spark.sql("show databases").show()
+------------+
|databaseName|
+------------+
|     default|
+------------+
&amp;gt;&amp;gt;&amp;gt;&lt;/PRE&gt;&lt;P&gt;The content of hive-site.xml is not exactly the same in the spark/conf and hive/conf dirs.&lt;/P&gt;&lt;PRE&gt;-rw-r--r-- 1 hive  hadoop 23600 Sep 14 09:21 /usr/hdp/current/hive-client/conf/hive-site.xml
-rw-r--r-- 1 spark spark   1011 Sep 14 12:02 /etc/spark2/3.0.0.0-1634/0/hive-site.xml&lt;/PRE&gt;&lt;P&gt;I even tried initiating the Spark session with hive/conf/hive-site.xml, but even this did not help.&lt;/P&gt;&lt;PRE&gt;pyspark --files /usr/hdp/current/hive-client/conf/hive-site.xml&lt;/PRE&gt;&lt;P&gt;Should I copy the hive-site.xml file from the hive conf dir to the spark conf dir (or anywhere else as well)?&lt;/P&gt;&lt;P&gt;Or will changing a property in the Ambari UI work?&lt;/P&gt;</description>
      <pubDate>Sat, 15 Sep 2018 03:04:12 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-2-3-pyspark-sql-utils-AnalysisException-u-quot/m-p/190263#M83546</guid>
      <dc:creator>er_sharma_shant</dc:creator>
      <dc:date>2018-09-15T03:04:12Z</dc:date>
    </item>
    <item>
      <title>Re: Spark 2.3 : pyspark.sql.utils.AnalysisException: u"Database 'test' not found;" -  Only default hive database is visible</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-2-3-pyspark-sql-utils-AnalysisException-u-quot/m-p/190264#M83547</link>
      <description>&lt;P&gt;The default database it was showing was Spark's default database, whose location is '/apps/spark/warehouse', not Hive's default database.&lt;/P&gt;&lt;P&gt;I was able to resolve this by copying hive-site.xml from the hive conf dir to the spark conf dir.&lt;/P&gt;&lt;PRE&gt;cp /etc/hive/conf/hive-site.xml /etc/spark2/conf&lt;/PRE&gt;&lt;P&gt;Try running this query in your metastore database; in my case it is MySQL.&lt;/P&gt;&lt;PRE&gt;mysql&amp;gt; SELECT NAME, DB_LOCATION_URI FROM hive.DBS;&lt;/PRE&gt;&lt;P&gt;You will see two default databases there, one pointing to 'spark.sql.warehouse.dir' and the other to 'hive.metastore.warehouse.dir'. The locations will depend on the values you have set for these configuration properties.&lt;/P&gt;</description>
      <pubDate>Sun, 16 Sep 2018 03:53:49 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-2-3-pyspark-sql-utils-AnalysisException-u-quot/m-p/190264#M83547</guid>
      <dc:creator>er_sharma_shant</dc:creator>
      <dc:date>2018-09-16T03:53:49Z</dc:date>
    </item>
    <item>
      <title>Re: Spark 2.3 : pyspark.sql.utils.AnalysisException: u"Database 'test' not found;" -  Only default hive database is visible</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-2-3-pyspark-sql-utils-AnalysisException-u-quot/m-p/190265#M83548</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I am trying to run a Spark application which needs access to Hive databases, but Hive databases like FOODMART are not visible in the Spark session.&lt;/P&gt;&lt;P&gt;I ran &lt;STRONG&gt;spark.sql("show databases").show()&lt;/STRONG&gt;; it does not show the FOODMART database, even though the Spark session has enableHiveSupport.&lt;/P&gt;&lt;P&gt;I have tried the following:&lt;/P&gt;&lt;P&gt;1)&lt;/P&gt;&lt;PRE&gt;cp /etc/hive/conf/hive-site.xml /etc/spark2/conf&lt;/PRE&gt;&lt;P&gt;2)&lt;/P&gt;&lt;P&gt;Changed spark.sql.warehouse.dir in the Spark UI from &lt;STRONG&gt;/apps/spark/warehouse&lt;/STRONG&gt; to &lt;STRONG&gt;/warehouse/tablespace/managed/hive&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;Even so, it is still not working.&lt;/P&gt;&lt;P&gt;Please let me know what configuration changes are required for this.&lt;/P&gt;&lt;P&gt;Please note: the above works in HDP 2.6.5.&lt;/P&gt;</description>
      <pubDate>Tue, 07 May 2019 04:22:05 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-2-3-pyspark-sql-utils-AnalysisException-u-quot/m-p/190265#M83548</guid>
      <dc:creator>shashank_naresh</dc:creator>
      <dc:date>2019-05-07T04:22:05Z</dc:date>
    </item>
  </channel>
</rss>