<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Hadoop : SparkSQL context to include Hive - Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hadoop-SparkSQL-context-to-include-Hive/m-p/164573#M36895</link>
    <description>&lt;P&gt;I added the config property spark.local.dir, and this
seemed to resolve the issue below; I can now select from tables when
connecting through port 10015 in Beeline.&lt;/P&gt;&lt;P&gt;I set it to /tmp, since the directory needs to be writable by the Spark
process. I first tried creating a sub-directory, /tmp/spark-tmp, and changed its
ownership to spark:hadoop, but Spark rejected it for some reason, perhaps because
the directory wasn’t executable.&lt;/P&gt;&lt;P&gt;From /var/log/spark/spark-hive-org.apache.spark.sql.hive.thriftserver.HiveThriftServer2-1-mdzusvpclhdp001.mdz.local.out:&lt;/P&gt;&lt;P&gt;16/08/03 14:58:14 ERROR DiskBlockManager: Failed to create local
dir in /tmp/spark-tmp. Ignoring this directory.&lt;/P&gt;&lt;P&gt;java.io.IOException: Failed to create a temp directory (under
/tmp/spark-tmp) after 10 attempts!&lt;/P&gt;</description>
    <pubDate>Thu, 04 Aug 2016 14:48:56 GMT</pubDate>
    <dc:creator>Former Member</dc:creator>
    <dc:date>2016-08-04T14:48:56Z</dc:date>
    <item>
      <title>Hadoop : SparkSQL context to include Hive</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hadoop-SparkSQL-context-to-include-Hive/m-p/164573#M36895</link>
      <description>&lt;P&gt;I added the config property spark.local.dir, and this
seemed to resolve the issue below; I can now select from tables when
connecting through port 10015 in Beeline.&lt;/P&gt;&lt;P&gt;I set it to /tmp, since the directory needs to be writable by the Spark
process. I first tried creating a sub-directory, /tmp/spark-tmp, and changed its
ownership to spark:hadoop, but Spark rejected it for some reason, perhaps because
the directory wasn’t executable.&lt;/P&gt;&lt;P&gt;From /var/log/spark/spark-hive-org.apache.spark.sql.hive.thriftserver.HiveThriftServer2-1-mdzusvpclhdp001.mdz.local.out:&lt;/P&gt;&lt;P&gt;16/08/03 14:58:14 ERROR DiskBlockManager: Failed to create local
dir in /tmp/spark-tmp. Ignoring this directory.&lt;/P&gt;&lt;P&gt;java.io.IOException: Failed to create a temp directory (under
/tmp/spark-tmp) after 10 attempts!&lt;/P&gt;</description>
      <pubDate>Thu, 04 Aug 2016 14:48:56 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hadoop-SparkSQL-context-to-include-Hive/m-p/164573#M36895</guid>
      <dc:creator>Former Member</dc:creator>
      <dc:date>2016-08-04T14:48:56Z</dc:date>
    </item>
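    <!-- Editor's note: the asker's guess that the missing execute (search) bit broke
    /tmp/spark-tmp can be checked in isolation. A minimal sketch using a throwaway
    directory (not the actual cluster path):

    ```shell
    # A directory needs the execute (search) bit before a process can create
    # files inside it; mode 600 on a directory blocks writes even though the
    # write bit is set, which matches the DiskBlockManager failure above.
    d=$(mktemp -d)          # throwaway stand-in for /tmp/spark-tmp
    chmod 600 "$d"          # read/write but no execute bit on the directory
    stat -c '%a' "$d"       # shows 600: the spark process could not enter it
    chmod 700 "$d"          # add the execute bit back
    stat -c '%a' "$d"       # shows 700: now usable as a spark.local.dir
    rmdir "$d"
    ```

    For the real directory, the same chmod 700 (as the owning user) would have
    made /tmp/spark-tmp usable without falling back to /tmp itself. -->
    
    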
    <item>
      <title>Re: Hadoop : SparkSQL context to include Hive</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hadoop-SparkSQL-context-to-include-Hive/m-p/164574#M36896</link>
      <description>&lt;P&gt;@&lt;A href="https://community.hortonworks.com/users/3134/sanchinakishore.html"&gt;kishore sanchina&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Your spark user must be able to create folders under /tmp/spark-tmp. Based on your comments, the ownership change did not take effect. You could grant recursive ownership of /tmp so that it covers all subfolders, whether existing or created at runtime:&lt;/P&gt;&lt;PRE&gt;chown spark -R /tmp&lt;/PRE&gt;&lt;P&gt;This assumes your user is spark.&lt;/P&gt;&lt;P&gt;However, I really don't like the idea of using /tmp for this (sysadmin taste). Consider instead a folder created under SPARK_HOME.&lt;/P&gt;</description>
      <pubDate>Tue, 27 Dec 2016 06:25:46 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hadoop-SparkSQL-context-to-include-Hive/m-p/164574#M36896</guid>
      <dc:creator>cstanca</dc:creator>
      <dc:date>2016-12-27T06:25:46Z</dc:date>
    </item>
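    <!-- Editor's note: the reply's preferred alternative (a dedicated scratch
    directory rather than chown -R over all of /tmp) can be sketched as a small
    setup script. The spark:hadoop owner comes from the thread; the scratch path
    here is a stand-in, and the privileged chown step is commented out because it
    needs root:

    ```shell
    # Sketch: prepare a dedicated scratch directory for spark.local.dir instead
    # of handing the spark user ownership of everything in /tmp.
    SCRATCH=$(mktemp -d)                 # stand-in; in practice e.g. a dir under SPARK_HOME
    chmod 700 "$SCRATCH"                 # owner-only rwx: writable AND searchable
    # chown -R spark:hadoop "$SCRATCH"   # the reply's ownership step; requires root
    echo "spark.local.dir $SCRATCH"      # line to add to spark-defaults.conf
    ```

    The echoed line would go into conf/spark-defaults.conf, after which the
    Thrift server is restarted so Spark picks up the new local directory. -->
    
    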
  </channel>
</rss>

