<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Druid on Hive LLAP - HDP2.6.1 in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Druid-on-Hive-LLAP-HDP2-6-1/m-p/182748#M77853</link>
    <description>&lt;PRE&gt;Hi,
I am creating a table in Hive (LLAP) from an existing table, as follows:

CREATE TABLE test_druid STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler' 
TBLPROPERTIES (   "druid.datasource" = "test_druid",   "druid.segment.granularity" = "MONTH",   "druid.query.granularity" = "DAY") 
as  select cast(trans_date as timestamp) as `__time` , col1, col2, col3 from testdb.test_hive_Table where to_date(trans_Date) &amp;gt;= '2018-01-01';

I am getting the following error. Nowhere did I specify "/druid/segments" as the Druid storage location, and I do not know where it is being picked up from.

In Ambari, I have set druid.storage.storageDirectory=/user/druid/data, so I am not sure what is causing the issue.
Please help.

	Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=hive, access=WRITE, inode="/druid/segments/ff7e385a0fbf4d84973be64d88f31c02":hdfs:hdfs:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
        at org.apache.hadoop.hdfs.server.namenode.FS

&lt;/PRE&gt;</description>
    <pubDate>Fri, 16 Sep 2022 13:10:08 GMT</pubDate>
    <dc:creator>lnc_adoni</dc:creator>
    <dc:date>2022-09-16T13:10:08Z</dc:date>
    <item>
      <title>Druid on Hive LLAP - HDP2.6.1</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Druid-on-Hive-LLAP-HDP2-6-1/m-p/182748#M77853</link>
      <description>&lt;PRE&gt;Hi,
I am creating a table in Hive (LLAP) from an existing table, as follows:

CREATE TABLE test_druid STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler' 
TBLPROPERTIES (   "druid.datasource" = "test_druid",   "druid.segment.granularity" = "MONTH",   "druid.query.granularity" = "DAY") 
as  select cast(trans_date as timestamp) as `__time` , col1, col2, col3 from testdb.test_hive_Table where to_date(trans_Date) &amp;gt;= '2018-01-01';

I am getting the following error. Nowhere did I specify "/druid/segments" as the Druid storage location, and I do not know where it is being picked up from.

In Ambari, I have set druid.storage.storageDirectory=/user/druid/data, so I am not sure what is causing the issue.
Please help.

	Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=hive, access=WRITE, inode="/druid/segments/ff7e385a0fbf4d84973be64d88f31c02":hdfs:hdfs:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
        at org.apache.hadoop.hdfs.server.namenode.FS

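-- The per-session form of the fix later confirmed in this thread (a sketch:
-- the property name comes from HiveConf, and the value shown assumes the
-- same directory already configured in Ambari). Run it before the CTAS:
SET hive.druid.storage.storageDirectory=/user/druid/data;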
&lt;/PRE&gt;</description>
      <pubDate>Fri, 16 Sep 2022 13:10:08 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Druid-on-Hive-LLAP-HDP2-6-1/m-p/182748#M77853</guid>
      <dc:creator>lnc_adoni</dc:creator>
      <dc:date>2022-09-16T13:10:08Z</dc:date>
    </item>
    <item>
      <title>Re: Druid on Hive LLAP - HDP2.6.1</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Druid-on-Hive-LLAP-HDP2-6-1/m-p/182749#M77854</link>
      <description>&lt;P&gt;It is coming from the Hive configuration as its default. You can set the following property from Ambari to make it global for all sessions:&lt;/P&gt;&lt;PRE&gt;hive.druid.storage.storageDirectory&lt;/PRE&gt;&lt;P&gt;&lt;A href="https://github.com/b-slim/hive/blob/f8bc4868eced2ca83113579b626e279bbe6d5b13/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java#L2626" target="_blank"&gt;https://github.com/b-slim/hive/blob/f8bc4868eced2ca83113579b626e279bbe6d5b13/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java#L2626&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 02 May 2018 21:21:32 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Druid-on-Hive-LLAP-HDP2-6-1/m-p/182749#M77854</guid>
      <dc:creator>sbouguerra</dc:creator>
      <dc:date>2018-05-02T21:21:32Z</dc:date>
    </item>
    <item>
      <title>Re: Druid on Hive LLAP - HDP2.6.1</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Druid-on-Hive-LLAP-HDP2-6-1/m-p/182750#M77855</link>
      <description>&lt;P&gt;Thanks Slim, that worked.&lt;/P&gt;</description>
      <pubDate>Thu, 03 May 2018 19:34:59 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Druid-on-Hive-LLAP-HDP2-6-1/m-p/182750#M77855</guid>
      <dc:creator>lnc_adoni</dc:creator>
      <dc:date>2018-05-03T19:34:59Z</dc:date>
    </item>
  </channel>
</rss>

